this post was submitted on 14 Apr 2024
52 points (98.1% liked)

TenForward: Where Every Vulcan Knows Your Name

3729 readers
1231 users here now

/c/TenFoward: Your home-away-from-home for all things Star Trek!

Re-route power to the shields, emit a tachyon pulse through the deflector, and post all the nonsense you want. Within reason of course.

~ 1. No bigotry. This is a Star Trek community. Remember that diversity and coexistence are Star Trek values. Any post/comments that are racist, anti-LGBT, or generally "othering" of a group will result in removal/ban.

~ 2. Keep it civil. Disagreements will happen both on lore and preferences. That's okay! Just don't let it make you forget that the person you are talking to is also a person.

~ 3. Use spoiler tags. This applies to any episodes that have dropped within 3 months prior of your posting. After that it's free game.

~ 4. Keep it Trek related. This one is kind of a gimme but keep as on topic as possible.

~ 5. Keep posts to a limit. We all love Star Trek stuff but 3-4 posts in an hour is plenty enough.

~ 6. Try to not repost. Mistakes happen, we get it! But try to not repost anything from within the past 1-2 months.

~ 7. No General AI Art. Posts of simple AI art do not 'inspire jamaharon'

~ **8. Political commentary is allowed, but please keep discussions civil. Read here for our community's expectations.

Fun will now commence.


Sister Communities:

!startrek@lemmy.world

!memes@lemmy.world

!tumblr@lemmy.world

!lemmyshitpost@lemmy.world

Want your community to be added to the sidebar? Just ask one of our mods!


Honorary Badbitch:

@jawa21@startrek.website for realizing that the line used to be "want to be added to the sidebar?" and capitalized on it. Congratulations and welcome to the sidebar. Stamets is both ashamed and proud.


Creator Resources:

Looking for a Star Trek screencap? (TrekCore)

Looking for the right Star Trek typeface/font for your meme? (Thank you @kellyaster for putting this together!)


founded 10 months ago
MODERATORS
 

I just watched Measure of a Man, they rule Data has the right to choose. But in Voyager the EMH gets relegated to forced servitude. Why? Doesn’t that violate precedent?

top 32 comments
sorted by: hot top controversial new old
[–] MajorHavoc@programming.dev 19 points 7 months ago* (last edited 7 months ago) (3 children)

I think the implication is that Data only has rights because he's hardware and incredibly hard (maybe impossible) to copy. Essentially, he has rights only because he's no real threat to the status quo.

Edit: I think this line of reasoning also dovetails with genetically modified humans being ostracized and having to live in hiding, because they too threaten the status quo.

[–] goldteeth@lemmy.dbzer0.com 12 points 7 months ago (1 children)

he’s hardware and incredibly hard

-Tasha Yar, probably

[–] jawa21@lemmy.sdf.org 2 points 7 months ago

Yar can't speak if the tar has Yar covered in tar. Arr.

[–] SzethFriendOfNimi@lemmy.world 7 points 7 months ago (1 children)

Or because he can die. E.g. if he’s destroyed that’s it. While the EMH can be copied

I’d argue, however, that each instantiation has its own memories and experiences making each of them unique sentient beings

[–] Orbituary@lemmy.world 3 points 7 months ago

Only if initiated. The same argument has been made of genetic clones. Any deviation of circumstance creates variance.

[–] AA5B@lemmy.world 3 points 7 months ago* (last edited 7 months ago)

As a hardware device, Data ages and is mortal. Those too are characteristics of life. Note that in future shows he ages himself on a more human timeline to experience this part of life

[–] sxan@midwest.social 16 points 7 months ago (4 children)

I could write a program today that would insist it is alive and beg for its life. With a decent LLM behind it, I could easily, today, make it convincing. Is that program self aware? I think most of us would argue not.

Star Trek computers have never been AI, in any way an expert would define truly self-aware artificial intelligence. There's no justification about why they neither have nor use AI, but Data was declared as distinct from Trek computers specifically because he had a positronic brain which, in classic Trek terms, somehow made him different from Trek computer programs. Trek programs, and especially holodeck ones, were classified as "simulations." Sometimes, as someone else pointed out, simulations could escape their boundaries and become truly self- aware and arguably truly AI.

There's probably some deep discussion on a Trek board about this subject, but IMO, in the Trek universe Trek engineers and scientists have a more clear understanding and better definitions for AI; and what distinguishes a truly alive, self-aware entity from a merely very good simulation of one. When things like Moriarty come up, they're plot devices to play on our poorly understood definitions and distinctions, to drive a story. Was Vaal sentient? It was hostile, so it probably made no difference, but the characters seemed to easily separate it into the category of "just a machine."

I believe that there's simply some given understanding - probably some basic theory taught at school - that allows characters to make the distinction; some bit of Trek advanced knowledge we haven't yet discovered. Most computers in Trek are not capable of producing true AI, only very convincing simulations, and these are not considered alive, or have rights.

[–] teft@lemmy.world 11 points 7 months ago* (last edited 7 months ago) (1 children)

a program begs for its life

Example A:

[–] janus2@lemmy.zip 5 points 7 months ago

"Not a robot... not a girl."

Janet is one of my all time fave characters

[–] stoicmaverick@lemmy.world 5 points 7 months ago (1 children)

I think that just kicks the debate down one level. What actually is self awareness? An LLM can state that it is an LLM, and explain it's own workings accurately, but I can also write "I am a small, yellow piece of paper" on a sticky note with the same effect. What is the nature of belief?

[–] sxan@midwest.social 3 points 7 months ago (1 children)

My point is that those are our arguments. My head cannon is that, just like Star Trek engineers know how to build a phaser (and we do not), and understand warp theory, Star Trek scientists also know how to distinguish true artificial intelligence, with an internal dialog and self-awareness from simulated intelligence.

[–] stoicmaverick@lemmy.world 3 points 7 months ago

I'm saying that I don't think it's a knowable thing, because I don't think it's a digital state. Case in point: I feel like I know a bit more than average about the workings of the human mind at a biological level, but you could place me next to the world's greatest neurologist, a philosopher, a Scientologist (who believes.... whatever it is that they believe about the way the mind works), and a non-English speaker, who has been expressly taught how to say the words in the correct order without knowing what they're saying, and we could all profess our own existence. We would all be saying the same words, but meaning something completely different in our own minds. A similar example could be made using people with mental illness, or neurodivergence, but you start stumbling into really dark moral places really fast doing that.

[–] Silentiea@lemmy.blahaj.zone 4 points 7 months ago

I believe that there's simply some given understanding - probably some basic theory taught at school - that allows characters to make the distinction

If that were the case, why would we have multiple episodes about whether or not a computer is a person? Or, perhaps more significantly, why was the 'basic theory' not brought up when we did?

[–] Crackhappy@lemmy.world 0 points 6 months ago

I think Moriarty would beg to differ, my dear Watson.

[–] Kolanaki@yiffit.net 12 points 7 months ago (1 children)

The way the law was written probably only pertains to androids and not any other form of artificially created life.

[–] wise_pancake@lemmy.ca 12 points 7 months ago

I guess it’s not even law law, just case law, and the conclusion was “Data has the right to choose”, so maybe not generalizable.

[–] Soulcreator@lemmy.world 12 points 7 months ago (3 children)

To me it makes sense, there are a ton of ethical ramifications to the EMH being sentient that the people of the federation would likely have a very hard time coming to grips with.

Data is a novel one of a kind technology for which the vast majority of people have never seen or interacted with. It's easy to classify him in one way or another as it doesn't effect their life in the big scheme of things.

EMH on the other hand is just a standard hologram, not one created via extenuating circumstances. Meaning the people of the federation would effectively be creating and destroying lifeforms for their own pleasure every time they use the holodeck. I think the modern day equivalent would be to say that every time you turn off your TV or change the channel someone has to die. Or better yet imagine if every time you killed an opponent I'm a video game a sentient life form would have to die.

Possibly a better modern analogue would be the meat and dairy industry. People in modern times commonly accept that dogs are sentient unique individuals with their own personalities, likes, wants and a possibly even a soul. But cows are mindless automatons where it's okay to use them for our pleasure. They aren't 'real' to people the way dogs and cats are. Most people don't want to consider the ethical ramifications of every meal they eat especially when they've been doing things one way for the majority of their life.

If EMH is sentient does that mean they have to stop the use of the holodeck all together? Do they have to "dumb down" the processing of holodeck characters to prevent it from accidentally creating a sentient life? And what would be the ethical implications of all of your holodeck adventures? If you have sex with someone on a holodeck adventure is that considered rape or sexual assault? What is consent if your programmed to feel a certain way from inception.

These are heavy issues for the average federation grunt to have to ponder every time they want to blow off some steam. OR they can just put EMH in the same bucket as every other holodeck character who thinks they are alive but in reality are probably just a few lines of code sitting on the computers storage.

[–] wise_pancake@lemmy.ca 6 points 7 months ago

Thanks, I guess that also ties in to that stereotypical Irish village they created too, where the hologram was running so long the characters got paranoid.

[–] vithigar@lemmy.ca 5 points 7 months ago

See also: Moriarty

[–] HobbitFoot@thelemmy.club 2 points 7 months ago* (last edited 7 months ago)

The Federation is internally inconsistent at grappling with the implications of AI. It helped that Data was a humanoid robot. The Exocomps took far longer and even then it only seemed to be revolving around not using them as expendable equipment until Peanut Hamper joined Starfleet. Even then, the discussion seemed to be around life instead of sentience.

By the time Voyager's EMH started claiming its rights, it seemed like the Federation was only ok with claiming sentient life if hardware was tied to software, something that the EMH lacked.

Then, after the ~~Butlerian Jihad~~ burning of Mars, all AI research was put on hold. I don't know what happened to holographic based AI in that time.

[–] Hobbes_Dent@lemmy.world 10 points 7 months ago (2 children)

Tasha Yar, that's why. Also there were some significant protests about holo rights by the crew stranded in space with the same people for some reason.

[–] ummthatguy@lemmy.world 16 points 7 months ago (3 children)

In the alternate timeline before older Janeway goes back, the EMH has a wife. So presumably quite functional.

[–] HyperCube@kbin.social 8 points 7 months ago

He also mentions his "upgrades" when talking to the EMH Mark 2 in "Message in a Bottle".

[–] Teal@lemm.ee 3 points 7 months ago

He’s…an impressive individual.  A physician, an engineer, warrior.  And very attractive to the females.

[–] Infynis@midwest.social 2 points 7 months ago

And he's definitely studied all the masters

[–] wise_pancake@lemmy.ca 8 points 7 months ago (2 children)

Is the EMH fully functional?

[–] LopensLeftArm@sh.itjust.works 8 points 7 months ago

He'll give you a Picardoing you'll never forget!

[–] teft@lemmy.world 2 points 7 months ago* (last edited 7 months ago)

Yep, he has to leave his sexual activity subroutines behind when he is sent back to the Alpha Quadrant to cure Dr Zimmerman. Seven calls them "unessential".

[–] grue@lemmy.world 8 points 7 months ago (1 children)

It's non-canon, but if it makes you feel any better, according to the "Path to 2409" lore in Star Trek Online The Doctor won a court case and was ruled sentient in 2394.

(Apparently it was even expanded into a class-action lawsuit, and resulted in the freeing of 600 other EMH Mk. 1 dilithium-mining slaves.)

[–] trolololol@lemmy.world 4 points 7 months ago

Somewhere it also wins the copyright of a novel he wrote. In dispute was whether he was a person or not, because machines can't produce copyrightable material

[–] randon31415@lemmy.world 5 points 7 months ago (1 children)

A bit off topic, but there is going to be a lot of AI anachronism that legacy universe sci-fi writers are going to have to deal with. How come in 2024 we have LLMs, but (insert previously tech coherent universe here) in (future date) is doesn't have/ is just now discovering AI?

[–] wise_pancake@lemmy.ca 3 points 7 months ago

I find the ship computer acts like a very restrained LLM