How AI Could Hurt Your Job Search & Other News
#NEWSLETTER | The speed of AI innovation right now is matched only by fervent government accountability efforts aimed at Big Tech ... but whose best interests are we actually serving?
ChatGPT’s parent OpenAI introduced little sister “Sora,” which can create video from text instructions. “A new application before I’ve even dug into the text product?” you ask. Yes! But it’s important to pay attention, even just conceptually, because of the implications for entire industries (not the least of which is Hollywood), as well as the risk deepfakes pose to us all. Right now Sora is text-to-video only and not widely available to users. Thankfully, OpenAI is taking the associated risks seriously, and we should consider the bigger picture here too, especially for our kids.
New York City announced a lawsuit against Meta, Google, Snap, and ByteDance (TikTok) last week, and yet at the same time the agency leading the charge (NYC Health & Hospitals) spent tax dollars in 2023 advertising to kids as young as 13 years old on Facebook, Instagram, and Snapchat. How can we ask children to trust us if we set such a poor example for them to follow?
The College Board was found liable for making millions on teenagers’ data. The College Board is a nonprofit that, among other things, administers the Advanced Placement exams. There has been a smattering of suits over the privacy of the data it holds on kids, but it’s the suit in New York that stuck most recently. Unfortunately, considering the millions of dollars made selling the information of 250,000+ secondary school students to at least 1,000 companies, there hasn’t been nearly enough news about the case — not to mention the paltry settlement sum of $750K.
Could AI tools be hurting your job search? Despite efforts to force transparency when companies use AI tools to hire talent, the process may be going haywire at your expense. While you may have leaned on AI to write cover letters or perfect your resume, getting caught up in AI’s screening algorithms may mean your information never even reaches a human.
This Week’s Tips & Tricks
(Each week I’ll share ways to engage and educate kids, protect our data, and embrace the opportunity of new technology. Have an idea? Let me know!)
✅ Watch those cameras. After reviewing a few privacy policies with my kids, including those of streaming services such as Netflix, Hulu, and Disney+, my daughter said, “glad I cover the camera when I watch TV in the shower.” Leaving aside that ill-advised habit, it’s important to check which apps have access to your smart device and its microphone and camera. It’s not about being spied on per se, but if you give apps access, companies will absolutely make ample use of it. iPhone users can go to Settings » Privacy & Security and toggle off access tool by tool. Scroll to the bottom and, via App Privacy Report, you can review who accessed what (note: it looks scary, but it’s primarily cookies).
✅ Try the new ChatGPT “marketplace.” Chatbots are often more exciting in concept than in reality, because they require you to think through what you might use them for. That is exactly why ChatGPT’s new “marketplace” is exciting and an obvious next step toward consumer adoption. What does it mean? If you are hung up on the question “what would I use ChatGPT for?”, the work is done for you. Need help organizing a trip? There’s a GPT for that. Math help? There’s a focused GPT for that too. I created one that answers any question with an appropriate 1980s song lyric and a tongue-in-cheek reply 😜… think of it as a fun, risk-free way to start considering more *serious* applications!
❌ Don’t answer every question on kids’ applications. You may be in the thick of helping your children fill out applications for summer activities right now, from camps to internships and more. Remember that not every question must be answered. This is not to suggest we mislead, but rather that we decline where possible. Why? Right now even the well-intentioned are struggling to protect kids’ data, and we can’t always be sure that the organizations we trust are aware of loopholes that can be exploited. A great example is this story I wrote about NYC’s partnership with Talkspace. So, if in doubt and it’s not essential, leave it out!