Down With Cookie Walls, Give Us the Web Privacy API!

Google, Facebook, and other advertisers profile us and invade our privacy in every way imaginable. Lawmakers are always behind, and cookie walls and opt-out schemes don't do much except ruin the web. Is there a better way? How about a new, GDPR-enforced Web Privacy API? This article lays out a design for an alternative, in the form of an RFC.

A few days ago Google announced that it's killing FLoC. The invasive tracking technology, announced in 2021 and met with widespread opposition, is finally being dropped, but only to be replaced by the Topics API. This new proposal is technically different, but it still aims at the same goal: profiling users' interests for ad targeting, this time directly in the browser.

Key Design Guidelines for Building Privacy-Friendly Applications

The discussion around digital rights is ongoing, and while privacy is not yet a guaranteed right, many users greatly value it. If you are looking to develop privacy-friendly applications, here are some design principles to keep in mind.

1. Restraint

While some amount of data collection may be necessary for your application to function, it’s important to make sure you are not collecting any more data than is strictly necessary. The less user data your application collects, the easier it’ll be to keep that data safe. And collecting as little data as possible also makes it easier to follow the other principles in this guide.
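One simple way to practice restraint is to enforce data minimization at the collection boundary. The sketch below is my own illustration (the field names and the `minimize` helper are hypothetical, not from the article): only an explicit allowlist of fields ever enters storage, so adding a new form field can't silently expand what you collect.

```python
# Hypothetical example: an allowlist at the collection boundary.
# Assumption: email and display_name are all this app truly needs.
ALLOWED_FIELDS = {"email", "display_name"}

def minimize(submitted: dict) -> dict:
    """Drop every submitted field that is not on the allowlist."""
    return {k: v for k, v in submitted.items() if k in ALLOWED_FIELDS}

raw = {
    "email": "user@example.com",
    "display_name": "Sam",
    "birth_date": "1990-01-01",   # never needed, so never stored
    "phone": "+1-555-0100",       # likewise dropped before persistence
}
print(minimize(raw))
# {'email': 'user@example.com', 'display_name': 'Sam'}
```

Making the allowlist the single source of truth also gives you one obvious place to audit when reviewing what personal data the application holds.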

How to Leverage 9 Digital Product Development Principles to Design and Build Successful Products

According to a 2018 US Chamber of Commerce report, 84% of small businesses use at least one digital platform to promote their business, while 79% use digital tools to engage with customers and suppliers.

Today, the estimated annual global digital ad spending stands at a staggering $332.84 billion. Closer to home, Americans purchased a whopping $586.92 billion worth of products and services online. 

How to Protect Dataset Privacy Using Python and Pandas

Working with datasets that contain sensitive information is risky, and as a data scientist, you should be extremely careful whenever this type of data is present in a dataset. People dealing with sensitive information often labor under the misconception that removing names, IDs, and credit card numbers eliminates the privacy risk. While removing direct identifiers can help, other information elements in a dataset can still be used to re-identify an individual. For example, Latanya Sweeney, Director of the Data Privacy Lab in the Institute of Quantitative Social Science (IQSS) at Harvard, showed that 87 percent of the US population can be re-identified using just zip code, gender, and date of birth.

In this post, I am going to show you how to effectively reduce the privacy risk of a dataset while maintaining its analytical value for machine learning.
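As a minimal sketch of what such a risk reduction can look like (my own illustration, not the article's actual code), the pandas snippet below generalizes the three quasi-identifiers Sweeney highlights: the 5-digit zip code is truncated to its 3-digit prefix and the exact date of birth is coarsened to a birth year, while the analytical column is kept as-is.

```python
import pandas as pd

# Toy dataset: zip, gender, and dob are quasi-identifiers;
# purchase_total is the analytical value we want to preserve.
df = pd.DataFrame({
    "zip": ["02139", "02141", "90210"],
    "gender": ["F", "M", "F"],
    "dob": ["1985-03-14", "1985-07-02", "1990-11-23"],
    "purchase_total": [120.5, 80.0, 45.25],
})

# Generalize the quasi-identifiers instead of dropping them outright.
df["zip"] = df["zip"].str[:3] + "**"                 # 02139 -> 021**
df["birth_year"] = pd.to_datetime(df["dob"]).dt.year  # exact DOB -> year
df = df.drop(columns="dob")

print(df)
```

Generalization like this trades precision for privacy: rows that were unique on (zip, gender, dob) now share values with more of the population, while aggregate analyses on `purchase_total` remain possible.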