Mayo Clinic Works with TripleBlind to Streamline Diagnosis Process

We were excited this week to announce a new collaboration between TripleBlind and the Mayo Clinic. Mayo plans to use TripleBlind’s solution to validate the interoperability of encrypted algorithms on encrypted data and the training of new algorithms on encrypted data. Our solution will enable Mayo and other health care systems to generate insights from highly regulated data without actually accessing the data – ensuring compliance with HIPAA and other standards.

Today, health care systems have to transfer either data or algorithms outside their institution for experts to train models or conduct research. This process can take many months and typically involves complicated legal contracts and a significant amount of time from technicians. Our solution eliminates this transfer entirely, protecting both the data and the intellectual property in the algorithm.

Suraj Kapa, M.D., a practicing cardiologist and director of AI for knowledge management and delivery at Mayo Clinic, noted:

“Training novel algorithms on encrypted data sets and facilitating trust between independent parties is critical to the future of AI in medicine. By using advanced mathematical encryption technologies, we will greatly enhance scientific collaboration between groups and allow for more rapid development and scalable implementation of AI-driven tools to advance healthcare.”

TripleBlind’s advanced mathematics toolset will allow Mayo to improve the current process by enabling Mayo technicians to perform diagnostic services using data wherever it resides. The collaboration focuses on three milestones. The first milestone involves validating that an algorithm already created and “trained” at Mayo can be delivered to remote hospitals and used locally to achieve accurate diagnoses. This milestone will also demonstrate that TripleBlind’s toolset provides accurate results and effectively enforces compliance with data privacy regulations such as GDPR, HIPAA, data residency requirements and other standards.

The second milestone will demonstrate that TripleBlind’s toolset can be applied to train an algorithm that resides at Mayo on remotely accessed data and then provide diagnostic services. The third milestone will demonstrate that TripleBlind’s solution can support any type of medical data, with a particular focus on genetic data. Genetic data is especially difficult to aggregate because, almost by definition, it cannot be deidentified using conventional approaches.

We are eager to work together with the Mayo Clinic to assist them in providing more and better services to health care systems around the country.

This news follows TripleBlind’s recent announcement it has received funding and marketing support from Accenture Ventures, the investment arm of global professional services company Accenture (NYSE: ACN). TripleBlind is applying this funding to further refine our solution.

We wish our colleagues, partners, investors and other friends of the company the best for the holiday season. We will have another announcement highlighting our growth in early January; please check back then.

Mayo Clinic Press Release

TripleBlind Collaborates with Mayo Clinic on Next Generation Algorithm Sharing and Training on Encrypted Data

KANSAS CITY, Mo., Dec. 15, 2020 (GLOBE NEWSWIRE) — TripleBlind announced today it is collaborating with Mayo Clinic researchers who will use TripleBlind tools to validate interoperability of encrypted algorithms on encrypted data and the training of new algorithms on encrypted data.

TripleBlind has created a rapid, efficient and cost-effective, data-privacy-focused solution based on breakthroughs in advanced mathematics, which will be used and validated by the Mayo team. No Mayo data will be accessed by TripleBlind.

Today, health care systems have to either transfer data or algorithms outside their institution for experts to train or conduct research. The encryption conduit being evaluated will eliminate the need for data transfer or for sharing the algorithm, thus protecting intellectual property. TripleBlind’s solution functions as the innovative data encryption conduit that keeps the data and intellectual property in the algorithm secure.

“We hope to demonstrate the potential of applying TripleBlind’s data privacy and data clean room solution to accelerate how we develop, test, and deploy AI solutions in healthcare, particularly amidst heavily regulated privacy concerns,” said Riddhiman Das, co-founder and CEO of TripleBlind. “We are eager to work together with Mayo Clinic to explore TripleBlind’s encryption tools and showcase real-world applications.”

The goal of this collaboration is also to demonstrate that TripleBlind’s toolset can be applied to train entirely new algorithms from independent entities anywhere in the world without the need to share raw data, thus preserving privacy and security while meeting regulatory standards.

“Training novel algorithms on encrypted data sets and facilitating trust between independent parties is critical to the future of AI in medicine. By using advanced mathematical encryption technologies, we will greatly enhance scientific collaboration between groups and allow for more rapid development and scalable implementation of AI-driven tools to advance healthcare,” said Suraj Kapa, M.D., a practicing cardiologist and director of AI for knowledge management and delivery at Mayo Clinic.

Mayo Clinic and Dr. Kapa have financial interest in the technology referenced in this release. Mayo Clinic will use any revenue it receives to support its not-for-profit mission in patient care, education and research.

About TripleBlind

TripleBlind’s patented breakthroughs in advanced mathematics arm organizations with the ability to share, leverage and monetize regulated data, such as PII and PHI, and mission-critical enterprise data, such as tax returns and banking transactions. It unlocks the estimated 105 petabytes of data stored by enterprises today that are inaccessible and unmonetized due to privacy concerns and regulations. With TripleBlind, decision makers generate new revenue for their organizations by gaining deeper insights faster, creating improved modeling and analysis, and collaborating more effectively with customers and partners and even competitors, without compromising safety.

For more information, please visit tripleblind.ai.

Contact

Victoria Guimarin
UPRAISE Marketing + Public Relations for TripleBlind
tripleblind@upraisepr.com
415.397.7600

Accenture’s Press Release

Accenture Makes Strategic Investment in TripleBlind to Bolster Data Privacy and Increase Data Collaboration Opportunities

Startup joins Accenture Ventures’ Project Spotlight program to accelerate go-to-market strategy

NEW YORK; Nov. 18, 2020 – Accenture (NYSE: ACN) has made a strategic investment, through Accenture Ventures, in TripleBlind, a data privacy and virtual clean room solution provider.

Sharing and extracting insights from sensitive data, such as patient records or tax returns, can help accelerate cures for diseases or reduce financial crimes. Yet concerns around data privacy and government regulations have prevented many companies from identifying these insights to date. TripleBlind helps enterprises share sensitive information with their stakeholders more effectively – without ever decrypting the data, helping them comply with regulatory requirements.

“Organizations can yield valuable insights and unlock trapped value by combining and collaborating around large volumes and different types of data, but in order to do this they need to trust that the privacy of that data is protected,” said Shail Jain, global lead for Data & AI Group for Accenture Technology. “We believe that TripleBlind not only has the capabilities to facilitate collaborative data exchanges, but to also give organizations confidence that data privacy remains intact.”

TripleBlind is now part of Accenture Ventures’ Project Spotlight, a deeply immersive engagement and investment program that targets emerging technology software startups to help the Global 2000 embrace the power of change and fill strategic innovation gaps. Project Spotlight offers extensive access to Accenture’s deep domain expertise and its enterprise clients, to help startups harness human creativity and deliver on the promise of their technology. Through the program, TripleBlind will co-innovate with Accenture at its Innovation Hubs, Labs and Liquid Studios, working with subject matter experts to bring its solutions to market more quickly and more effectively.

 


 

“The global market for big data and business analytics is projected to reach more than $500 billion by 2026. As that market grows, the pressure within enterprises to share data to uncover new revenue opportunities and gain competitive advantage will grow as well,” said Riddhiman Das, co-founder and CEO of TripleBlind. “TripleBlind’s next-generation cryptographic, efficient and scalable data privacy and virtual clean room solution can replace ineffective workarounds like complex legal contracts, data anonymization or deidentification, and other technologies such as homomorphic encryption, while helping to avoid regulatory statutes and data residency violations. Accenture Ventures’ investment and our participation in its Project Spotlight program will advance and accelerate our ability to help enterprises harness the potential of sensitive data.”

Tom Lounibos, managing director, Accenture Ventures, added, “Our investment in TripleBlind demonstrates Accenture Ventures’ commitment to cultivating the latest technologies, enhanced by human ingenuity, that solve for our clients’ most critical business needs. We believe that TripleBlind offers our clients a key to enhancing data privacy while ensuring regulatory compliance – a major challenge in today’s environment.”

TripleBlind is the latest addition to the investment portfolio of Accenture Ventures, which is focused on investing in companies that create or apply disruptive enterprise technologies.

Terms of the investment were not disclosed.

About TripleBlind

TripleBlind was founded in 2019 and is headquartered in Kansas City. Its new data privacy and virtual clean room solution enables enterprises to leverage regulated data, such as PII and PHI, and enterprise-sensitive data without violating GDPR, HIPAA, the California Consumer Privacy Act (CCPA), data residency and other standards. For more information, please visit tripleblind.ai.

About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Interactive, Technology and Operations services—all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 506,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.

Contact:
Christina McDonald
Accenture
+1 415 537 7997
christina.mcdonald@accenture.com

Victoria Guimarin
UPRAISE Marketing + PR for TripleBlind
+1 415.397.7600
tripleblind@upraisepr.com

Stay Tuned! Leading Global Professional Services Company Invests in TripleBlind

In about 10 days, we will have a significant announcement. The VC arm of a leading global professional services company will invest in TripleBlind and greatly accelerate our growth by connecting us to their ecosystem of companies and expertise.

Our new investor was eager to enter the data privacy space and organized a global review of the companies in the space. After a thorough review, they chose TripleBlind, noting that our solution is much more effective than the workarounds available today – complex legal contracts, data anonymization/de-identification and cumbersome alternative technologies, such as homomorphic encryption. For an example of how our solution works, please have a look at our recent blog.

Some of the obvious problems with these workarounds include:

  • Complex legal contracts often contain limiting terms and/or liability clauses so onerous that companies are unwilling to enter into them; even when they do, negotiations are typically expensive and take months to complete, despite the potential collaboration benefits
  • Data anonymization/deidentification is hard to execute well. It is ineffective if it doesn’t meet three criteria: individualization (it must not be possible to identify an individual), correlation (it must not be possible to cross-check multiple data sets to identify an individual), and inference (it must not be possible to deduce information about an individual from the data set)
  • Other technologies have their own drawbacks: homomorphic encryption is a slow process that dramatically taxes compute performance. Differential privacy introduces inaccuracies into the eventual calculation (see the sketch following this list). Secure enclaves are hardware dependent, do not support AI/ML at scale, are susceptible to known hacker attacks and are not easy to update.
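To make the differential privacy trade-off above concrete, here is a minimal, hedged sketch of the standard Laplace mechanism. It is illustrative only and is not TripleBlind’s technology; the dataset, threshold and epsilon values are assumptions chosen for the example.

```python
# Minimal sketch of epsilon-differential privacy via the Laplace mechanism.
# Illustrative only; not TripleBlind's approach. All values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def private_count(values, threshold, epsilon):
    """Return a noisy count of values above `threshold`.

    A count query has sensitivity 1 (one person changes it by at most 1),
    so Laplace noise with scale 1/epsilon yields epsilon-DP.
    """
    true_count = int(np.sum(np.asarray(values) > threshold))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

ages = rng.integers(18, 90, size=10_000)  # synthetic stand-in data
for eps in (0.1, 1.0, 10.0):
    noisy = private_count(ages, threshold=65, epsilon=eps)
    print(f"epsilon={eps:>4}: noisy count of people over 65 = {noisy:.1f}")
```

The smaller the epsilon, the stronger the privacy guarantee and the noisier the answer, which is exactly the accuracy cost noted in the list above.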

Shortly after announcing this large investor, we plan to announce new funding by several experienced VC firms and veteran angel investors. In fact, this pre-seed round was oversubscribed.

What is generating so much interest in our solution? Today, IBM estimates that enterprises fail to utilize as much as 93% of their data to generate incremental revenue by collaborating with customers, partners and even competitors. This is typically regulated data, such as PII and PHI, or mission-critical enterprise data such as tax returns and banking transactions. Our solution enables enterprises to unlock the information and insights within this data (note that I said the information and insights, not the data itself) while remaining in compliance with GDPR, HIPAA, CCPA (California Consumer Privacy Act) and similar regulations.

At TripleBlind, we think the better answer is mathematically enforced cryptography that doesn’t rely on laws or outdated technology, but instead builds privacy into the protocol itself. We are working to “build in” privacy preservation, and we want to give enterprises the keys to either lock or unlock their data as they see fit.

We are excited to apply these investments to complete product development and testing, and bring our solution to the over $500 billion data analytics market.

Stay tuned for upcoming announcements of major new customers!

How TripleBlind can Help Produce Better Models

The Power of TripleBlind Technology as demonstrated through the Blind Learning Model

It’s an age-old rule of AI: an algorithm is only as good as the data it is trained on. If the training dataset is small or biased in some way, the model will not be able to produce accurate results when new scenarios are presented. To ensure a model’s accuracy and effectiveness, it is imperative that the model be trained on a large quantity of accurate and unbiased data. However, this is often difficult to accomplish due to data scarcity. Even if an organization is able to acquire a large amount of data, this data is likely skewed in some way (location, ethnicity, gender, etc.) as it only comes from the organization’s own customers and/or users. Therefore, the organization’s models, trained only on its own data, are not universally applicable. This poses a large problem for any organization that wishes to deploy an algorithm for wide scale use. There is, however, a very clear solution: find a way to train the model across multiple datasets, provided by other organizations. 

Here, we present an example of how the TripleBlind technology can do just that: solve the problem of data scarcity while maintaining privacy.

Example with Publicly Available Data

To demonstrate the power of TripleBlind technology, we developed a test case with the following characteristics: portions of the MNIST* dataset held by two different organizations were used to train a convolutional neural network (CNN).

After separating 29,000 MNIST images into two datasets of 6,000 and 23,000 images, we assigned one dataset to each of two organizations, Organization A and Organization B. We first trained a CNN model using only Organization A’s dataset. The resulting model, trained on only Organization A’s 6,000 images, proved to be 91.60% accurate when tested against a separate set of 1,000 MNIST images.

Then, using TripleBlind’s privacy-ensuring technology, we used the same scripts to train a CNN model over the datasets owned by both Organization A and Organization B. Because we used TripleBlind’s platform, we were able to drastically increase the size of the training dataset, resulting in a CNN model trained over all 29,000 images. We tested the model in the same way, against the same set of 1,000 MNIST images as before. The result was an accuracy of 96.10%.
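For readers who want to see the shape of this experiment, below is a hedged sketch in plain PyTorch. It simply pools the two datasets centrally, so it is not TripleBlind’s blind learning protocol (which never exposes either organization’s raw data); the CNN architecture, split boundaries and hyperparameters are assumptions for illustration rather than the exact ones used in our test.

```python
# Hedged sketch: baseline (Org A only) vs. pooled (A + B) training on MNIST.
# Not TripleBlind's protocol; architecture and hyperparameters are assumptions.
import torch
import torch.nn as nn
from torch.utils.data import ConcatDataset, DataLoader, Subset
from torchvision import datasets, transforms

torch.manual_seed(0)
tfm = transforms.ToTensor()
train = datasets.MNIST("data", train=True, download=True, transform=tfm)
test = datasets.MNIST("data", train=False, download=True, transform=tfm)

org_a = Subset(train, range(0, 6_000))        # Organization A's 6,000 images
org_b = Subset(train, range(6_000, 29_000))   # Organization B's 23,000 images
test_1k = Subset(test, range(1_000))          # 1,000 held-out test images

def make_cnn():
    return nn.Sequential(
        nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        nn.Flatten(), nn.Linear(32 * 7 * 7, 10),
    )

def train_and_eval(train_set, epochs=3):
    model, loss_fn = make_cnn(), nn.CrossEntropyLoss()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    model.train()
    for _ in range(epochs):
        for x, y in DataLoader(train_set, batch_size=64, shuffle=True):
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    model.eval()
    correct = 0
    with torch.no_grad():
        for x, y in DataLoader(test_1k, batch_size=256):
            correct += (model(x).argmax(1) == y).sum().item()
    return correct / len(test_1k)

print("Trained on Org A only   :", train_and_eval(org_a))
print("Trained on A + B pooled :", train_and_eval(ConcatDataset([org_a, org_b])))
```

The exact accuracies will vary with the architecture and training budget, but the larger pooled dataset should consistently outperform the 6,000-image baseline, which is the effect the TripleBlind platform delivers without either party ever exposing its data.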

By using TripleBlind’s technology, we were able to both increase the size of the training dataset and improve the accuracy of the model. The training dataset grew from 6,000 images to 29,000 images (roughly a 380% increase in size). This larger training dataset resulted in a more accurate model: 96.10% accuracy, compared to 91.60% for the model trained on less data.

Throughout the entirety of this process, privacy was ensured. Organization A was NOT able to see the data of Organization B. Likewise, Organization B was NOT able to see the data of Organization A. Neither organization could see the algorithm that was created. The training dataset grew nearly fivefold, improving the model’s accuracy, without compromising the privacy of either organization’s data.

One should note that this example was conducted with randomly distributed data: neither dataset was biased in any way. In the real world, Organization A’s 6,000 images would likely be skewed somehow. Perhaps 90% of its 6,000 images would come from the digits 1-5, while 90% of Organization B’s images might consist of the digits 6-9. With such biased datasets, the accuracy difference between the first model and the model trained using TripleBlind’s platform would be even more dramatic.
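To illustrate the hypothetical biased scenario just described, here is a hedged sketch of how such non-IID splits could be constructed for an experiment. The 90/10 proportions, digit assignments and dataset sizes come from the hypothetical example above, not from a real deployment.

```python
# Hedged sketch: build the biased (non-IID) splits described in the example.
# Proportions, digit groups and sizes are illustrative assumptions.
import numpy as np
from torch.utils.data import Subset
from torchvision import datasets, transforms

rng = np.random.default_rng(0)
train = datasets.MNIST("data", train=True, download=True,
                       transform=transforms.ToTensor())
labels = np.asarray(train.targets)

def biased_indices(home_digits, size, home_frac=0.9):
    """Sample `size` indices with `home_frac` of them drawn from `home_digits`."""
    home = np.flatnonzero(np.isin(labels, home_digits))
    away = np.flatnonzero(~np.isin(labels, home_digits))
    n_home = int(size * home_frac)
    idx = np.concatenate([
        rng.choice(home, n_home, replace=False),
        rng.choice(away, size - n_home, replace=False),
    ])
    rng.shuffle(idx)
    return idx.tolist()

# Org A: 90% digits 1-5; Org B: 90% digits 6-9 (per the hypothetical above).
org_a = Subset(train, biased_indices([1, 2, 3, 4, 5], size=6_000))
org_b = Subset(train, biased_indices([6, 7, 8, 9], size=23_000))
# A model trained on org_a alone rarely sees the digits 6-9, so the benefit of
# (privately) training across both datasets should be even larger here.
```

Feeding these two subsets into the same comparison sketched earlier would show a wider accuracy gap between the single-organization model and the jointly trained one.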

*Modified National Institute of Standards and Technology

A Story of Privacy During the Pandemic

On February 27th we started something here at TripleBlind. After a conversation with Ramesh Raskar of MIT about an idea that struck him when Mike Pence’s team arrived at a healthcare conference asking for ideas on how to battle the coronavirus, we began building a contact tracing solution. (At the time, almost nobody outside of certain specialist circles, including us, even knew the term “contact tracing.”) This became the open source cellphone app known as Private Kit, which spawned the community behind the COVID Safe Paths app and the Safe Places system.

This system is good. It is going to help stop the spread; it will #flattenthecurve. Even more importantly, I believe this is how we restart the world. I think having 14 days of “all green dots” is going to be the key to leaving our homes, opening our schools, rejoining our colleagues and feeling safe around strangers again.

But most importantly, this is Private.

Why is private important? It is an important sounding word, but is it really italics-worthy important? I mean, really, isn’t fighting the disease that is paralyzing the entire world more important? Can’t we forget Private for now? Let me tell you a story…

Several years ago there was a wildly contagious disease known as MERS (Middle East Respiratory Syndrome). It hit South Korea hard. Hard enough that people were scared and privacy was the last thing they cared about. A bill was passed to help authorities perform contact tracing. To make tracing more efficient, authorities were authorized to collect video footage, credit card records and everyone’s cell phone location. The days preceding a patient’s diagnosis were almost perfectly reconstructed and published so the community could tell if they’d made contact with these people. It worked, and MERS was stopped. So…that’s a good thing, right?

Here is the next chapter of that story. Recently, COVID-19 patient #15 in a district of South Korea was broadcast — texted to the cellphones of everyone in a certain neighborhood. This text included a link to some facts. A female in her 20s working at Jacks Bar had been diagnosed. The trail of her last few days was published so people could tell if they’d made contact with her. This was for the public good; a little loss of privacy is worth it — right?

Then the worst tendencies of scared people took over. The trail was examined…scrutinized by an online crowd. She’d gone home sick on March 27th. She went to eat at a restaurant on March 30th. She waited until April 3rd to go to the hospital. She even stayed with someone one night. How irresponsible! What a self-centered youth! And what about that person she’d stayed with? It was “anonymous” data; no names were ever published. But how many 20-year-olds work at Jacks Bar? And how many of them live in that specific neighborhood?

People are storytellers. Points of data get connected and stories just naturally emerge. It is human nature.

Here is another version of that same story: a 20-something young woman starts to feel sick in the middle of a terrifying epidemic.¹ She leaves work early. Scared. Afraid she’s sick. Afraid she’ll lose her job. Afraid she won’t be able to pay her bills. After hiding for two days she is too weak to cook, but starving. She goes to get some food, hoping it will hold her over until it passes. Finally, scared for her life, she goes to the hospital and learns she has the virus. She was lucky and recovered. But this victim will forever be branded as the girl who jeopardized her neighbors. She’ll always be blamed for anyone who got sick, whether she actually came near them or not. The disease is gone, but she may never recover.

I’m not certain if my story is completely true. But it is why Private matters. Surveillance by a trustworthy official is easy to allow. Privacy is hard. But it is possible. And she deserved it. So does my daughter. So does my son. So do you.

This is why I care. This is why I’ve worked 18 hours a day, every single day for nearly two months. This is why TripleBlind exists.

#Covid19 #FlattenTheCurve #PrivacyMatters

 

¹ Partially inspired by a segment at 7:40 of “The Coronavirus Guilt Trip.”

 

The Privacy “Faustian Bargain”

As many of you know, I recently joined my good friend Riddhiman Das in an effort to build a cryptographically powered privacy system. We’ve been joined by a small team of experts, and we are working hard to build an API that will enable bulletproof privacy as a service. Why does the world need “bulletproof privacy as a service”? I’m glad you asked! The short answer is that many of our most ubiquitous online services have developed business models that depend on surveilling us, and then “monetizing” (i.e. read “selling”) the data they accumulate. Data about us – some of which is deeply personal.

The following was prompted by the November 20, 2019 issue of “The Download” (from the people over at MIT Technology Review). Today’s issue alone refers to at least four articles regarding the loss of privacy most of us suffer due to what the first article calls the “Faustian bargain” most users are forced to make. 

The first article, from Amnesty International, is a “scathing indictment of the world’s dominant internet corporations” (https://apnews.com/7380c7d8a66443618223c5e86132cfae). The passage that caught my attention is: “This ubiquitous surveillance has undermined the very essence of the right to privacy,” the report said, adding that the companies’ “use of algorithmic systems to create and infer detailed profiles on people interferes with our ability to shape our own identities within a private sphere.” The article then quotes Amnesty International as making a recommendation that is logical but unfortunately inadequate: “Amnesty called on governments to legally guarantee people’s right not to be tracked by advertisers or other third parties. It called current regulations — and the companies’ own privacy-shielding measures — inadequate.” Good thought, but regulations won’t do it all. There are too many legislative hurdles (i.e. read “lobbyists”), and in large parts of the world the local legal systems aren’t strong enough to enforce good regulations. At TripleBlind we think the better answer is mathematically enforced cryptography that doesn’t rely on laws; instead, privacy is built into the protocol. We are working to “build in” privacy preservation, and we want to give you the keys to either lock or unlock your data as you see fit.

The second article is about a home camera system and the many ways data (i.e. read “pictures of you, your friends and whoever walks past your house”) from these devices is used: https://arstechnica.com/tech-policy/2019/11/cops-can-keep-ring-footage-forever-share-it-with-anyone-amazon-confirms/ This article makes a couple of points that caught my attention. The first is that the camera company shares data with other entities in ways that are not transparent to its customers (i.e. you and me). The second is that once the camera company shares that data with an undisclosed third party, it can no longer control what happens to or with the data (i.e. read “the undisclosed third party can use the pictures for whatever purpose or resale it wants”). At TripleBlind we believe both of these practices are wrong. We are working to build a system that allows you to control the release of the information in your data (note – I did not say “data”, I said “information in your data”) differentially – when, to whom, for what purpose, for what duration and at which price.

The third article is what I think the authors meant to be a case of “surveillance for good”. I think most of us would say the goal (helping people with gambling disorders control their behavior) is a good one: https://www.bbc.com/news/technology-50486835 That said – think of the privacy implications of this application, especially when the behavior in question is coupled with a “frequent player” card/id. When the casino knows who you are, and that you display behaviors its marketing department associates with being a “good customer”, how do you think it will react? There is a very good chance the casino already knows individual customers’ gambling behavior and has tailored its marketing to that behavior. It is probably going to encourage you to visit the casino as often as possible. In the overall ecosystem of individually targeted advertising intended to get customers into the casino, do you really think making some customers take a few-second break will make a difference? At TripleBlind we believe the better answer is to keep that “frequent player” card identity private and allow the customer to control the dissemination of the data associated with it (and the advertising associated with it).

Consider the fourth article a type of “public service announcement” from the folks at the Mozilla Foundation. https://foundation.mozilla.org/en/privacynotincluded/ It’s their “creepy rating” of various Christmas gift items. We can’t all go live in a cave or under a rock until better privacy tools arrive, but we can be vigilant and at least try to manage the privacy compromises we make everyday. 

In the meantime, at TripleBlind we are working to deliver tools that will allow you to control your data, differentially release the knowledge in it and let it interact with algorithms in a way that protects both the data and the algorithm from disclosure. We believe this is going to be good for everyone. Once your privacy is cryptographically enforced, you and the companies with which you do business will find even more interesting (and potentially lucrative) ways to use the ever larger and ever more granular data we produce every day. We might even find a way to change the terms of the privacy “Faustian bargain”.

I continue to believe this privacy thing is a big deal. 

Let’s Eat a Private Cake

After I left Ant Financial/Alibaba, I was filled with gratitude toward Ant Financial, Alibaba, and our global partners for three incredible years – they have been an absolute blast. No one is enabling global financial inclusion at the rate Ant Financial is, and I’m grateful to have gotten an opportunity to foster that. I worked in China, Israel, the USA, Canada, Colombia, Mexico, Brazil, India, Indonesia, Singapore, Thailand, Malaysia, the Philippines, South Korea, Hong Kong, Japan, Macau, Germany, Russia, the UK, Finland and several other countries. The work has grounded me and helped me understand how enabling global trust at the scale Ant does helps people self-actualize. I will forever be an Aliren.

As for what’s next for me – I am going to take a stab at building cryptographically powered privacy, without reliance on the legal system. This effort is called TripleBlind. We are building an API that will enable bulletproof privacy as a service, allowing the option to enforce privacy mathematically.

As more and more of our information is stored and transacted in the digital world rather than the analog world, the current approaches we take to such private transactions fall short. The default approach is to slap some mumbo-jumbo legalese into a privacy policy with the expectation that no one will ever read it. The evidence suggests these approaches don’t work – they leave companies free to abuse the trust afforded to them by their end users.

The legal/contractual approach to privacy falls short for several reasons:

  1. It still leaves huge holes that allow misuse of the data, intentionally or otherwise, both internally and externally. Breaking compliance requires just one incompetent or malicious actor in the entire organization: for example, the major credit bureau that used “admin/admin” as the credentials for its primary database, or the major credit card issuer that kept all of its credit pull information in an unsecured S3 bucket.
  2. The custodians or owners of the data cannot consent to every operation performed on that data. While they might have the option to do so on paper, there’s no way to enforce it. Enforcement relies on the right organizational processes and structures being in place, and those are fallible, if they exist at all. If the privacy policy is in the way of a particular operation, the data custodian can unilaterally change the privacy policy contract on the actual data owner. If you’re lucky, you might get an email at 3 a.m. telling you that the contract changed and you somehow already consented to it.
  3. The western world also has a tendency to take the rule of law for granted. As we shift to a world where the vast majority of internet users are not from the western world, incumbent approaches that assume contracts can actually be enforced are inherently “broken”.

The core thesis around TripleBlind is that privacy enforced data & algorithm interactions can unlock the tremendous amount of value that is currently trapped in private data stores and proprietary algorithms. If we move from a world of “don’t be evil” to “can’t be evil”, we can enable entities to freely collaborate around their most sensitive data and algorithms without compromising their privacy, allowing them to work together to create compounded value in a way never before possible.

Around privacy, I believe we can have our cake and eat it too – let’s eat a private cake.