First submission to HN so this could be the wrong kind of post, but I find it interesting that a platform that has had presumably millions in investment, boasts 30 million users (based on one of the founders' personal websites), and has dedicated engineers and staff can't be bothered to do the basics of ensuring marketing emails are sent out safely.
There's other posts about it on X, but they don't blur out emails so I won't share.
Well, this is the platform that got kicked off Discord for refusing to delete user accounts in a timely manner, and for training an AI model on user inputs...
Then tried to weaponize their userbase to mass email Discord over being kicked off.
And then they struggled to manage Reddit...then scaled to as many platforms as possible (text, WhatsApp, etc.), then shut it all down, then built out an entire API just to shut that down too (apparently; can't confirm). They can't even get dark mode to work by default on their own website. Suspended from Twitter/X. The founder, Anush, is making comments about "replacing engineers with AI" (source: https://x.com/anushkmittal/status/1979372588850884724), though that could be a joke, on their absolute dogshit "talk" platform.
And speaking of their chat platform...it's legitimately horrible. Slow to load, terrible UX, disjointed UI, no accessibility options, AI chatbots everywhere, and you can't even tell they're AI without clicking their profiles. It's like if Slack was made by a 12 year old.
Seriously, to put it in perspective: I'm on a MacBook Pro M3 (so not that old) with a gigabit fiber connection. I click one chat and it can take up to 5 seconds to load the channel. JUST TO LOAD THE CHANNEL. It legit fires off like 30 fetch requests each time you load a channel. It's insane. I can't even blame Next.js for that; it's straight up them probably "vibe coding" everything.
I've had NixOS on a list to give it a shot one day when I wanna tinker on a weekend project. It's been on that list for quite some time now and I just haven't had time for it. I hear good things about the project when I ask other devs, but all I read these days about Nix is infighting, politics, global bans, governance wrongdoings, and drama.
From the outside, this concerns me for the longevity of the project. I'm sure that if it makes it through, it could come out better for it; but it is concerning and makes me push it further down the list to try.
I worry, though, because politics and the way the org is run can impact the end user; maybe not today, but in a year. That's my concern with these kinds of projects: there seems to be no real leader. Say what you will about Linus Torvalds, and there's a lot to be said, but he really does care about the quality of the project.
Valid point tbh. If they said they were training off of data and never explicitly stated what data sources they were using, then Discord should be concerned about that violation. Would also love to see that announcement from them about it. If the announcement was made in Discord, then that only solidifies the reason for Discord to be concerned.
Which states:
"What’s more, community members have already interacted with Shapes enough to trigger millions of messages over the short, several month duration that the product has been in beta. We believe this head start in an emergent market will further enrich the conversation datasets which power Circle’s NPCs, and serve as a competitive moat over time."
This is an old post though, so its philosophy could've changed, but even back then stating something like that is concerning. I do feel it's worth calling out that the Discord developer policy did not explicitly state this until the 2024 policy, but that's been in effect since July 8th, 2024...so they had plenty of time to stop training their "shapes" on user data before this happened. And it seems they'd been in contact with Discord before, too, so they could've just gotten clarification on it or asked for permission.
Complete side-note: it bothers me how they're using all these examples of people who use these "shapes" for emotional support, basically as therapists, as a way to "strengthen" their argument, when IMO it weakens it. If so many people are reliant on robots and code for emotional support, then they need to seek help or seek real, human connection. It's not healthy to talk to these "shapes" all day. What's even more concerning is that the trauma you're dumping is then being used to "enrich the conversation datasets."
ETA: I also think Discord is probably taking action now so they can release their own version later without any competition, but this still could've been mitigated. Even if Discord got them on the tokens aspect, they could've had a really strong argument considering that's what they were advised to do.