Your Questions This Week
#NEWSLETTER | We tell our kids never to fear asking questions, and that is how we all should approach the years ahead... so I'm rounding out the week with some good ones.
Now more than ever we need to feel empowered to ask the big, sticky questions. The shift in the world to something more unstable, and to a new business model that requires copious amounts of our data, means none of us can be sure where we go from here. But the good news is that we are talking, and here is some of what friends, colleagues, media, and even strangers chatted about with me this week.
Why is Surveillance a Hot Topic? What’s Changed?
Say surveillance and many of us think: cameras to catch criminals, or maybe creepy spying, snooping, and the like. In some places, such as the UK, CCTV is part of life. In others, like NYC, there are no hard and fast laws about cameras, except in places where you would have the common "expectation" of privacy (bathrooms, locker rooms, and so on). But now cameras and footage are about data. Really valuable data. Your face is a complex arrangement of features and expressions as unique to you as a fingerprint, and any video, image, or other data captured of your face, body, and very being is incredibly valuable and could be used for ill. So we need to start talking about how to protect our visual data.
Is the TikTok Issue About Surveillance?
Sort of. Unfortunately the messaging around TikTok overall has gotten muddled, because *the* thing is the collection, storage, and use of our video and image data. Your face, like your finger and palm print, is valuable and highly sensitive "biometric" information. TikTok states clearly in its privacy policy (since it's already been sued for this very thing) that it does, in fact, collect biometric data; but it then gets opaque and non-committal about how that data is shared (e.g., maybe between companies, possibly for research, etc.). What our government does know is that the data is within easy reach of the Chinese Communist Party, and that is a line none of us should want to cross.
How Can Facial Data be Used for Ill?
China is the world's broker of facial recognition technology, bolstered by its long-term mass surveillance of its own people. The country has refined this capability, born of computer vision technology, over decades. Facial data can be used to steal (biometrics are now often used to access your bank accounts, devices, etc.), to create deepfakes (already a reality, as I wrote about previously), to extort, bribe, and embarrass — you name it. Personally, the biggest disappointment in all of this waffling around TikTok is how simple the facts are and how stark this new reality is for us. It's not a secret either. You can read it straight from the Director of National Intelligence's annual Threat Assessment report. "Our adversaries increasingly view data as a strategic resource" is just one of the simple phrases that gives me chills.
Aren’t US Social Media Companies Just as Bad?
Yes and no. Yes, if you think about how they are using your data to build AI applications; but no, because they can be held accountable for the collection, use, and storage of this data. We should be reading the policies, asking questions publicly, and, if we are uncomfortable with what we find, talking to our elected representatives. That is the way it works. Big Tech will continue to build the innovation we want to have, and it's on us to hold them accountable. We don't have that same ability to force accountability when it comes to foreign governments, and that is the problem. One of the very best pieces I've read explaining this distinction was in The Atlantic this week. It's worth a read.
Should I Rethink How My Data is Shared?
Yes, it's a valuable asset! And it belongs to YOU. It's not all doom and gloom. We need to see how exciting it is that we all hold this new power. The chart below is from a recent Goldman Sachs report on the "power" of data in the AI ecosystem. The graphic says it all (with a little commentary by me overlaid, of course). If our data is in such great demand, let's stop handing it out like candy!
What’s Your Concern About Social Media Legislation?
Of course, we know that our kids have been negatively affected by social media. They are easily lured in and hooked on the connection, voyeurism, and customization of feeds driven by the corporate need for engagement and advertising revenue. But the simple fact about every piece of legislation looking to control the time kids spend on these platforms is that it can't be done without handing over more of our kids' data. From Utah to Arkansas, and now New York, any law attempting to control kids' use of social media requires "proof"... or data. In Utah that has meant uploading an ID; in New York it puts trust in companies to use the right solution. But if you are going to have to defend yourself in court, then you must be as specific and accurate as possible in proving you have or haven't violated any age-related laws. And the idea of using any additional data, facial recognition data in particular, is a hard no from me.
Isn’t this All Too Big and Too Far Gone?
No! Let's create a little natural friction. If you don't need to share it, then don't. You don't have to accept cookies, or share your actual birthday (unless, of course, it's a bank, hospital, or government entity). You can answer "I prefer not to say" on forms, avoid using image filters on social media apps, and so on. If you think of every piece of your data as a gem or a bag of money, you might research what you are handing over. Read the privacy policies too! After a couple you'll see a pattern and know what you are comfortable with and what is a red flag.
What Are You Excited About?
Well, I added that question because I want to tell you! I'm excited that we are starting to talk about these issues. This week I was invited on a new show, NY Insider, hosted by Liam O'Neill. He's a fantastic host in search of sensible, thoughtful commentary on new ideas and issues in New York (I'll share the piece when it's live). I've also written an op-ed on the subject of Google and AI on the heels of my interview with the smart, thoughtful voice of families and education, Deb Fillman. And finally, with my Families for New York partner-in-crime, Danyela Souza Egorov, I dove into the subject of protecting student data in another op-ed last week.
Have ideas or questions? Please get in touch!