Intel and Partners Launch Private Artificial Intelligence Collaborative Research Institute

By Jemima McEvoy | Saturday, December 19, 2020

Changes to the workforce caused by the coronavirus pandemic have opened up new opportunities for entrepreneurs. More Americans than ever before are confined to their homes, turning bedrooms, living rooms, and dining rooms into makeshift offices. Numerous top companies, including Dropbox, Facebook, Okta, Shopify, Square, Twitter, and Zillow, have signaled that they may never return to a traditional office setting.

A trio of leading technology companies have joined together in hopes of creating a force for good. Announced December 3, global tech leader Intel is joining with Avast and Borsetta to form the Private AI Collaborative Research Institute. The goal of the newly formed institute is singular: to advance and develop technologies that strengthen privacy and trust for decentralized artificial intelligence (AI).

The Partnership

Everyone knows Intel, but its partners on this project are best known within the tech world. Avast is a global leader in digital security and privacy products. A Financial Times Stock Exchange 100 Index company with over 400 million users online, it offers products under its Avast and AVG brands that are designed to protect people from threats on the internet. Its threat detection network, which relies on machine learning and AI, is one of the most advanced in the world. Borsetta, for its part, describes itself as an AI software-defined secure computing hardware services company, specializing in enabling secure, trusted, and decentralized AI and private computation.

Executives from both companies expressed excitement about working with Intel in an announcement released the day the institute was formally introduced. “We're delighted to be joining forces with Intel and Borsetta, to unlock AI's full potential for keeping individuals and their data secure,” said Avast’s Chief Technology Officer Michal Pechoucek, while Borsetta’s CEO Pamela Norton touted the Private AI Collaborative Research Institute as having a mission “aligned with our vision for future-proof security where data is provably protected with edge computing services that can be trusted.”

Richard Uhlig, Vice President and Director of Intel Labs, also said Intel is “excited” to have both Avast and Borsetta on board “to mitigate potential downsides and dangers of AI.”

The Goal

The institute has a clear goal in mind, one that has become increasingly vital as technology has advanced: embracing a promising new trend in AI known as decentralization. Decentralized AI gives companies controlling large datasets the opportunity to be independent, a trend that has been supported by the development of blockchain technology. In these decentralized ecosystems, consumers, providers, data scientists, and others can collaborate to create AI without a centralized control authority.

"AI will continue to have a transformational impact on many industries, and is poised to become a life-changing force in healthcare, automotive, cybersecurity, financial and technology industries,” explained Uhlig. “That said, research into responsible, secure, and private AI is crucial for its true potential to be realized. The Private AI Collaborative Research Institute will be committed to advancing technologies using ethical principles that put people first and keep individuals safe and secure.”

The institute, like many others in the industry, argues that current centralized models limit performance. “For example, health data is siloed and cannot be used for centralized training due to privacy and regulatory constraints,” explains the institute’s website. “Autonomous cars generate terabytes of traffic data where bandwidth prevents centralized training. Personal computers and phones in billions of homes generate vast amounts of data daily, which cannot be uploaded due to privacy concerns.” However, some critics oppose the advancement of this trend, arguing that diminished regulation may lead to privacy oversteps.
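The “siloed data” problem the institute describes is commonly tackled with federated learning, in which model updates rather than raw data leave each silo and a server averages them (the FedAvg approach). The following is an illustrative sketch only, not a description of the institute’s actual methods; the silos, data, and model are hypothetical:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """Train a linear model locally by gradient descent; raw data stays in the silo."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w = w - lr * grad
    return w

def federated_average(weight_list, sizes):
    """Server step: average client models, weighted by local dataset size."""
    total = sum(sizes)
    return sum(w * (n / total) for w, n in zip(weight_list, sizes))

# Two hypothetical "silos" (e.g. hospitals) holding private data from the same process
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
silos = []
for n in (50, 150):
    X = rng.normal(size=(n, 2))
    silos.append((X, X @ true_w))

w_global = np.zeros(2)
for _ in range(20):  # communication rounds: only weights travel, never data
    local_models = [local_update(w_global, X, y) for X, y in silos]
    w_global = federated_average(local_models, [len(y) for _, y in silos])

print(np.round(w_global, 2))  # approaches [ 2. -1.]
```

The key property is visible in the loop: the server only ever sees model weights, so each silo’s records never cross its boundary, which is exactly the constraint the institute cites for health and consumer data.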

The Plan

With increased collaboration and a move away from cloud infrastructure, the Private AI Collaborative Research Institute is drawing ideas from around the world to encourage the development of AI technologies that respond to pressing, real-world problems. The hope is that data will be liberated from silos yet remain securely protected. For the approach to be effective, though, it requires huge amounts of data that are most often sensed at the edge, such as vehicle routing, industrial monitoring, and security threat monitoring.

The institute put out a call for research proposals early in the year, according to its website, and has chosen nine research projects to pursue. These projects come from top research institutions across the globe: Carnegie Mellon University (US); University of California, San Diego (US); University of Southern California (US); University of Toronto (Canada); University of Waterloo (Canada); Technical University of Darmstadt (Germany); Université Catholique de Louvain (Belgium); and the National University of Singapore.

Depending on its success, the institute could help push AI forward by leaps and bounds.

About the Author



Jemima is a journalist who enjoys reporting on business, particularly small business and entrepreneurship.
