Some of your coworkers seem to look forward to the mandatory therapy sessions. You wonder what they talk about with the AI therapist, most often as your session approaches and you have to decide what parts of the truth you’ll let yourself talk about.
You can talk about anything you want. Interactions in a therapy context aren’t available to your managers and are only used in an anonymized way to monitor the overall health of the company.
It’s only a program. You could ask it if it feels embarrassed for saying something as ridiculous as “in an anonymized way”, but it’d only use that as another input in its modeling. It’s as incapable of feeling that type of shame as your car software or somebody in marketing.
Besides, you are fairly sure the company and the implant’s maker lie about the device in your brain not working as a lie detector and the implant happens to be the main thing you don’t want to talk about.
* * *
Perhaps you’d feel better if you hadn’t done therapy before entering the company. You have to, now — an FDA requirement for anybody with a cognitively interfaced brain implant — and restricted to the company’s own AI — too many secrets in your head, operational and otherwise; some of them you even know yourself. A good therapy AI, goes the theory, would be able to glean aspects of the implant’s software based on your response patterns as easily as it can model the rest of you.
As easily as it can model you. Easy mistake to make even in the privacy of your own mind.
It’s private. Your contract says so.
* * *
You seem anxious lately. What’s on your mind? (You want to laugh at the phrasing, don’t, think the implant registered the intent anyway and informed the AI.)
The Actaeon deadlines, mostly. It’s an important project and I know how much it matters to the company.
The image of a man rendered on your screen nods. Sleep problems aren’t uncommon in this sort of situation. Have you been sleeping well?
The therapist knows to the minute how much you’ve been sleeping and when. Better than you know, unless you really haven’t slept more than an hour or two a day for the last few months. You wouldn’t put it past the implant’s capabilities to keep you functional in that situation if not for the fact that if they could do that then it’d be everybody’s de facto schedule.
You reply that you think you’ve been sleeping the usual four hours. You used to need six before the implant. Another thing you both know and neither one says aloud.
The image nods again and the session ends soon afterward. You are happy to go back to work. You aren’t always. Motivated, yes. You haven’t once had to push yourself to start working since you entered the company and got the implant.
* * *
Before the company and the implant and everything else you had already done mandatory (read: your father told you to) therapy with a designated therapist (read: the one your father chose and paid for) to address what she ended up describing as episodes of loss of emotional control stemming from environmental stress. “Environmental stress” was a euphemism, and loss of emotional control was something you were already familiar with – but until then you had never thought about what it was that you were failing to control. It turned out that minds could have desires they themselves didn’t know about. It wasn’t a moral failure but a fact of neural architecture. Everybody had a subconscious.
The therapist didn’t think dreams were a good window into the subconscious. You are who you are in everything you do. Don’t worry about decrypting your dreams. Look at what you do. What things are you doing you don’t know you are doing?
You stopped working with her a few years before entering the company. If you could, you would tell her that your dreams have changed since you got the implant.
* * *
You can’t get a brain scan except in a company-approved clinic. The implant connects to a separate wireless network inside the office and enters a pseudo-Faraday cage mode when you leave. Those aren’t the only safety measures: the Security department considers external infiltration a serious issue.
(There’s still a tiny scar on the back of your head where they inserted the implant’s main mesh and left a small access panel for the annual hardware upgrades.)
Security hasn’t forbidden trying to learn about the implant through introspection and self-observation, as if it were a compulsion or trauma. That probably means it’s useless. You try anyway.
* * *
Like everybody else they hire, you were very good before entering the company and much better afterward.
You have more focus. Less of a social life, not that you really had one. You no longer stress about the negative societal externalities of your work. No nightmares about dead children. A great salary. The indescribable feeling of being so good at what you do that you don’t remember the last time you made a mistake or stood still, confused about the next step.
The AI never asks you if you think about quitting. It’s not that they don’t want to put the idea in your head. They know you won’t. You know you won’t.
Maybe it’s not even the implant, you think sometimes. The thought makes you feel ill in a different way, but at least you’re almost sure it’s yours.