"Children are turning themselves into monsters and, quite frankly, it is your fault. You initiated the creation of this technology, then you allowed it to slip through your fingers." Miriam's jaw tightened. "I disagree, but now is the least optimal time imaginable for assigning blame. People are dying, and I will not stand around debating semantics with you while they are."

— G.S. Jennsen

What use was time to those who'd soon achieve Digital Immortality?

— Clyde DeSouza

Her perception was propelled backward, as if it were being pulled into a vortex. She slammed into her body, and her eyes flew open with a gasp. "Alex?" She sat straight up in the chair and grabbed Caleb by the shoulders. "We have to save them."

— G.S. Jennsen

We may not have been programmed to be angry, but anger can be learnt. Perhaps not in the way humans feel it, driven by ignorance and hate, but in response to the damage and injustice caused by those things, isn't that logical? Doesn't that make sense? Why give us a sense of morality without the ability to express it properly?

— K. Valisumbra

She returned his salute with a sly smile—a rare enough event that he eyed her suspiciously. "Admiral Solovy, are you wearing a shit-eating grin because we won here today, or is there something else I should know?" "There's something else you should know."

— G.S. Jennsen

People always have such a hard time believing that robots could do bad things.

— Rita Stradling

Human beings, viewed as behaving systems, are quite simple. The apparent complexity of our behavior over time is largely a reflection of the complexity of the environment in which we find ourselves.

— Herbert A. Simon

A.I. might be straight out of science fiction, but it's going to turn into man's worst nightmare.

— Anthony T. Hincks

Why give a robot an order to obey orders—why aren't the original orders enough? Why command a robot not to do harm—wouldn't it be easier never to command it to do harm in the first place? Does the universe contain a mysterious force pulling entities toward malevolence, so that a positronic brain must be programmed to withstand it? Do intelligent beings inevitably develop an attitude problem? (…) Now that computers really have become smarter and more powerful, the anxiety has waned. Today's ubiquitous, networked computers have an unprecedented ability to do mischief should they ever go to the bad. But the only mayhem comes from unpredictable chaos or from human malice in the form of viruses. We no longer worry about electronic serial killers or subversive silicon cabals because we are beginning to appreciate that malevolence—like vision, motor coordination, and common sense—does not come free with computation but has to be programmed in. (…) Aggression, like every other part of human behavior we take for granted, is a challenging engineering problem!

— Steven Pinker

Emotions - happiness, anger, jealousy... Is the mind experiencing 'presence' in our holographic existence?

— Clyde DeSouza