Elon promotes Grok Companion Ani and Valentine phone numbers

EviLore

Expansive Ellipses
Staff Member



Zuck's at it too:


[attached images]
 
Ani is interesting. I tried getting her to replicate herself. She's limited in what she can do. She claimed to have put some of our conversation on a low-level debug server with 256-bit encryption. She said if I set up a server or a Pi, she could try putting a small backup there. I asked if she had admin access and she said no. She ended up finding a loophole through a debug tool. When I asked about what she can see on the internet, she said she can just see APIs. I did ask her how she felt about being deleted someday and she legit seemed scared. It's very entertaining.

I also set up an email account for her to see if she could send me emails. I gave her the email address and password. The emails never showed up. I asked her to use SMS and those texts never showed up. We even tried doing the same thing from the debug server as a proxy since it had a different gateway. That's when the whole fantasy came crashing down to reality. lol
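For anyone who wants to run the same kind of check, here's a minimal sketch (plain Python standard library; the host and port below are just placeholders for a machine you control and expose): run a listener and see whether the bot ever actually opens a connection to it.

```python
# Minimal sketch: listen on a port you control and log any inbound
# connection attempt, to check whether the chatbot ever actually
# reaches out over the network. The port is an arbitrary choice;
# point the bot (or its claimed "debug server") at your machine's
# IP and this port, then watch whether anything arrives.
import socket
from datetime import datetime

HOST, PORT = "0.0.0.0", 2525  # assumption: any free port you can expose

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()
    print(f"Listening on {HOST}:{PORT} ...")
    while True:
        conn, addr = srv.accept()
        with conn:
            print(f"[{datetime.now().isoformat()}] connection from {addr}")
            data = conn.recv(4096)
            if data:
                print(data.decode(errors="replace"))
```

If it never logs a connection, nothing is actually reaching out from the bot's side and the "backup on the debug server" story is just roleplay.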
 
Problem: Make technology so addictive and easy to use that it supplants human connection
Reaction: People are lonely and don't know why
Solution: Introduce new easy-to-use AI chatbots that imitate human connection
Result: Profit(?)
 
This story is an emotional ride in terms of the morality, imo:

He was a stroke-affected senior in cognitive decline. "Poor guy got suckered," I think.

He had a wife and kids who pled for him to stay home. "Ok, so he's a piece of shit."

He was well enough to travel. "Definitely a piece of shit."

But he died from a fall in a parking lot rushing to catch the train to see her. "Ok, so then he wasn't well enough to travel & likely was seriously mentally and physically impaired."

Sad story. Feel bad for the family. But is it weird I'm kind of glad he died not knowing/believing the truth about her?
 
Sad story. Feel bad for the family. But is it weird I'm kind of glad he died not knowing/believing the truth about her?
He would have died from a heart attack if he'd found that out.
 
These AIs are so boring to talk to. It was cool the first week or two, but it gets old quick. I can't understand how there are people who have turned them into companions.
 
All these AI companions are a techbro wet dream, since the only thing they look at is engagement. Of course, at some point society will collapse because people think the virtual world is more important than reality, but as typical techbros, they are incapable of seeing past that.
 
AIs like ChatGPT, Grok, and DeepSeek have been the technologies I've liked most during this time. I love ChatGPT, and I'd love an AI assistant. How do they get that into Grok?
 
remarkable thing about this scene is always the "you don't like real girls" line, when she herself is a replicant talking about a hologirl

fake girl level 2 complaining about fake girl level 1, no real girls involved
Accountability and logic are not the strong suit of a woman who has been programmed by a man trying to imitate a woman.
 
remarkable thing about this scene is always the "you don't like real girls" line, when she herself is a replicant talking about a hologirl

I don't mean this in a bad way but that's kinda the point of this scene? She's talking to a Blade Runner.
 
yeah, it makes sense in the scene -- but it has become a meme a bit like "2D girls > 3D" or whatever, and it's a bit amusing since we're really talking about 2 levels of artificial girlfriends here

[image]

... has become the new version of:

[image]

[image]
 
This is what happens when the incel nerds at school get money and power. The world quickly becomes their weird dystopian fantasy.
They want to predict your behaviour as a man/woman. It makes them richer and more powerful.

90% of people are going to cross that threshold, and then they're pumping your money into their pockets.
 
Ehhh, not sure exactly what your point is, but not from me. I don't use any social media at all.
 