Macy's calls BIPA unconstitutional as biometric privacy issues expand globally, the power of Mine AI and what to make of Iowa's CDPA
View in browser

Data Privacy Happenings

**Hello and welcome to MineOS's monthly newsletter, The Privacy Mindset! 👋**

 

One of the major developments for data privacy regulation in the next few years will be how governments and regulators attempt to follow and amend legislation to cover biometrics more comprehensively. 

 

With the rise of AI tools, we are sure to see an influx of voice and image deepfakes, and not just the fun kind where AI does charming hip hop covers in Kanye West’s voice.


The majority of people have quite a few photos and voice recordings floating around the internet thanks to social media, leaving countless people vulnerable to being deepfaked. That makes policing biometric data critical to protecting people’s data rights.

 

The most powerful law on the books in the United States at the moment is Illinois’ Biometric Information Privacy Act, which has had multiple high-profile cases, including a recent ruling against White Castle that could expose the burger chain to damages exceeding $17 billion.

 

BIPA is one of the few data privacy laws with real teeth, so it’s unfortunate to see it frequently under fire from businesses. The latest offensive comes from Macy’s, the retail giant, which finds itself embroiled in a class action lawsuit in the Illinois court system titled Carmine v. Macy’s Retail Holdings, Inc.

 

Macy’s is alleged to have gathered video surveillance of thousands of customers without consent and sent the images to Clearview AI for processing through its facial recognition software.

 

In response to the lawsuit, which has now progressed several tiers through the Illinois court system despite Macy’s attempts to have the case dismissed, the retailer has publicly called BIPA unconstitutional due to “grossly excessive damages wholly unrelated to any actual harm, while at the same time making damages discretionary without any basis for how courts and juries should weigh this discretion.”

 

That is quite the hardline stance, and one that too many companies are taking as they posture against data privacy regulations (see BetterHelp’s statement after their FTC fine for improperly handling customer data). 

 

The fight over biometrics is likely to become a focal point over the next few years, and BIPA, as part of the current vanguard in the U.S., will need to prevail in these cases as they appear. 

Product Spotlight

MineOS AI automatically suggests data types and processing activities for the data sources in your data inventory, with up to 85% accuracy and no integrations required.

 

It works by leveraging machine learning to analyze public information, such as privacy policies, legal documentation, feature guides, and terms of service, along with context about your own business (company size, industry, etc.), producing accurate results and a solid baseline for your RoPA (Record of Processing Activities).
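As a rough illustration of this kind of classification, here is a toy keyword-based sketch. It is not MineOS's actual model (which is not public), and every name in it is invented for illustration:

```python
# Toy sketch: suggest likely data types from public text such as a
# privacy policy. A real system would use trained ML models; this
# uses simple keyword matching purely to illustrate the idea.
DATA_TYPE_KEYWORDS = {
    "email address": ["email", "e-mail"],
    "payment data": ["credit card", "billing", "payment"],
    "location data": ["gps", "geolocation", "location"],
}

def suggest_data_types(policy_text: str) -> list[str]:
    """Return the data types whose keywords appear in the text."""
    text = policy_text.lower()
    return [
        dtype
        for dtype, keywords in DATA_TYPE_KEYWORDS.items()
        if any(kw in text for kw in keywords)
    ]

# Example: a privacy policy mentioning email and billing would map
# to "email address" and "payment data".
```

In practice the suggestions would also be weighted by business context (industry, company size), as the paragraph above describes.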

Regulation Focus

ICDPA 715D.1.5

A "child" means any natural person younger than thirteen years of age.

 

One of the most interesting current regulatory debates is over how to define children’s age in terms of data privacy. What’s the cutoff for when children can give consent to data processing activities without parental guidance?

 

Iowa is the most recent American state to pass data privacy regulation, and the Hawkeye State has followed in the footsteps of California, Virginia, Colorado and others in setting the bar at 13 years of age.

 

This is in stark contrast to the GDPR, which sets the default age of consent at 16 (though member states may lower it to 13), a threshold more in line with what privacy advocates are calling for to help protect children’s data.

 

Utah, currently one of only six states with comprehensive data privacy regulation in the U.S., is looking to push this even further with a new law set to take effect in March 2024 that requires parental consent before children under 18 can fully access social media platforms.

 

How this will interact with the UCPA is not fully clear, but it creates an interesting new development in how children’s data will be governed, and hopefully it will bump the threshold up from 13 as future states pass bills.


Founder's Corner 

CTO & co-founder Gal Golan

Q: A major part of MineOS is providing automation for tasks like building a data inventory and handling privacy requests. Some people are reluctant to use automation, so how does Mine address these concerns?

 

A: From our experience, there are several reasons why some people may not trust software automation, and I’ll break down how we approach them to address the concerns:

 

1. Inadequate understanding.

 

Some individuals may not fully understand how software automation works and may view it as too complex, or as something that requires human judgment. This is especially true when people believe they have a complex tech stack or a unique use case.

 

While this is true in some rare cases, most of the time their use case and architecture are similar to those of other customers we have seen, which means our platform and automation already support them out of the box.

This should not come as a surprise, as one of the key advantages of engineering is the ability to reuse and build upon existing designs and solutions.

 

2. Negative past experiences.

 

Some people may have had negative experiences with automation in the past, such as software glitches, errors, or an unsatisfactory experience with another vendor in this space, leading to a loss of trust and skepticism.

 

To address such concerns, we provide a free proof-of-concept (POC) offering that enables potential customers to test-drive our platform and explore its features. This allows them to gain first-hand experience and address any doubts they may have.



3. Safety and security worries.

 

Some people may have concerns about the safety and security of automated systems, especially in industries where data is central to the business and data loss has a major impact.

 

MineOS was built with security and safety in mind, and that manifests itself in a few principles we follow when building automation:

  1. Limited access, permissions, and scope: our integrations use the minimal set of permissions required to perform their task, and we limit the scope and APIs used to perform those tasks.
  2. Audit logs: a full activity log records everything the automation does so we can always review it later.
  3. Understanding the business logic: we build automations only after understanding the business logic of the target system and the task being automated. When implementing a data deletion automation to respond to DSRs, for example, we don’t just write a piece of code that deletes data; we make sure that code is aware of business requirements. Automations for payment or e-commerce systems will retain invoices instead of deleting them, integrations into customer support systems will abort a deletion if the customer has an associated open support ticket, and so on.
  4. Simple setup: the more configuration and settings required to set something up, the higher the chance of human error and something going wrong. This is why all our integrations are ready to go with zero configuration.
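To illustrate the business-aware deletion logic described in point 3, here is a minimal hypothetical sketch. The record structure and function are invented for illustration and do not reflect MineOS's actual implementation:

```python
from dataclasses import dataclass, field

@dataclass
class CustomerRecord:
    """Hypothetical customer record in a target system."""
    email: str
    open_tickets: int = 0                        # open support tickets
    invoices: list = field(default_factory=list)  # billing history
    profile: dict = field(default_factory=dict)   # personal data

def process_deletion_request(record: CustomerRecord) -> str:
    """Handle a DSR deletion while honoring business rules:
    - abort if the customer has an open support ticket
    - retain invoices (bookkeeping obligation), delete the rest
    """
    if record.open_tickets > 0:
        return "aborted: open support ticket"
    record.profile.clear()  # delete personal profile data
    # invoices are intentionally retained, not deleted
    return "deleted: profile cleared, invoices retained"
```

The point of the sketch is that "delete" is not a single operation: the automation encodes which data must go, which must stay, and when the whole request should be deferred.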

Upcoming Events

Join us Thursday, May 4 at 2 PM GMT for the latest in our customer webinar series. We'll be joined by Ledgy's Information Security Program Manager Aengus O'Donoghue to explore the intersection of data privacy and security. Mark your calendars now!

 

We're always around

to talk data privacy.

Get in touch at press@saymine.com 


How did you like this month's issue?

Let us know

SayMine Technologies Ltd., 94 Igal Alon st., Alon 1, Tel Aviv, Israel, 6789155

Unsubscribe Manage preferences