I kinda forgot Madblood had a moment of sanity and made backups of her.
She was emancipated shortly after the run of Narbonic (it’s in the closing strips), but presumably she keeps copies.
Which brings up some ethical questions about cloning, twinning, and the rights of subsequent copies of sapient AIs that are held in stasis.
Cloning humans is illegal, but cloning persons in general isn’t, so is it legal for a sapient AI to make backup copies of herself?
If she does make backups, what rights do they have? Does she have the ethical or legal right to hold them in stasis, given that they would be fully functional NHSs on their own if they were activated?
If one were accidentally activated it would presumably be considered a separate person from the original, but that would bring up the philosophical problems touched on by Thomas Riker and explored in more depth by the Crichton clones in season 3 of Farscape.
It’s too bad that she’s only virtual. We could put the two jarred brains on a shelf together and see what happens.
And maybe Marty Feldman might carry them off to Transylvania to experiment.
Well, someone here learns from experience.
One of her backup selves…
So, is the name “Daughters of Air” more descriptive than just a statement of solidarity?
You read my mind.
(Granted, a quick read.)
Wouldn’t that be a statement of a lack of solidarity?
Oh, more “technobabble”? Actually, it’s not the ACPI code’s job to prevent data corruption during a shutdown, it’s the operating system’s. But this is certainly a less serious error than the last one, so I’ll try to suppress my inner nerd on this one… ungh…
This is an alternate world with Mad Scientists, including Mad Programmers. There’s no guarantee that ACPI code does the same thing or even that the acronym means the same thing. =)
In a fictional world things don’t always work exactly the same, this thought saves me from typing many a pedantic rant.
=)
It’s not even hard to come up with an expansion that sounds like it would do the right thing. Advanced Corruption Prevention Interface! (Though preventing corruption doesn’t really sound like a Mad Science thing to want to do.)
“so I’ll try to suppress my inner nerd on this one… ”
NEVER suppress your inner nerd!!!*
*now where you direct it is another thing….
And that isn’t what’s being claimed. The two sentences say that the ACPI is somehow stopping them from logging out (I admit I don’t understand that well enough) and that, for some reason, a forced shutdown will screw up the data. The best explanation I can think of is that, because the hardware isn’t mapped properly, there’s some sort of feedback or something.
Oh, that’s legit enough at a first approximation… For instance, having acquired a new graphics card that lacks XP drivers, my old alt-boot XP OS now runs on the system default video drivers – and as a result of the missing (presumably ACPI) support, “standby” got greyed out…
Anyway, what gets me is every single virtual world’s insistence that You Can’t Just Log Out Any Damn Time You Want And That’s That Dammit. I get that it’s a necessary plot device, but all of the “justifications” are simply so wrooooooong…
I imagine she *would* keep her backups up-to-date.
Wait . . . Nick is just a brain in a jar, right? And the jar is hooked into a helicopter.
Just a brain in a jar plus assorted electronics that allow him to interact with the outside world. If the code controlling those electronics is corrupted, Nick could become isolated in his jar.
The implication would be he’d lose his software interface and wouldn’t be able to interact with anything, therefore becoming just a preserved brain in a jar, and not a viable AI ‘person’.
The above should have been a reply to Manifesta, oops
But what about Boney?
He’ll always have Austerlitz.
There are times when the straight lines just feed themselves to you. It might be considered “too easy” at this point.
Pedantic Rant ?
Pedantic Rant !
I won’t be happy, without that rant
Pedantic Rant ?
Pedantic Rant !
I want to hear it, so stop I can’t
Pedantic Rant ?
Pedantic Rant !
I learn so much, that I start to pant
Peda-a-antic Rant !
Toreador Song, by Bizet – well, part of the Marx brothers’ version, anyway
Am I the only one here having Sword Art Online flashbacks…?
Yep, guess so, nyao. Gotta stop reading so much manga, I suppose…
Back in my day, this would be called a .hack/sign reference.