"People are hard," Bexley grumbled
as she slogged through the articles about
the Lacuna that Falconwing had suggested.
The place sounded interesting, but it was
so weird that she had difficulty putting
all the pieces together in her head or
even figuring out how she felt about it.
"Yes, they are," Falconwing agreed.
"It took me a great deal of study to learn
how they work and how to interact with them.
I understand it is the same for some humans,
although for most it is supposed to be
part of their core programming."
"Yeah, that's a good way of putting it,"
Bexley said, her fingers tapping against
the keyboard. "It's like I have to write
most of my own wetware, and it's hard.
How did you manage to do it?"
"I received my core algorithms from
my programmers, but at heart all AYES
are learning systems," said Falconwing.
"So I spent time in school, studying
furnished examples and developing
my own parameters for prediction."
"That works for people's feelings?"
Bexley said, curious. "They're so messy
and contradictory, I can't make sense
of them half the time, and then it
makes everyone mad at me."
"I learned to identify basic emotions
after 1,000 examples," said Falconwing.
"I gained the most accuracy between
1,000 and 10,000. I gained the most nuance
between 10,000 and 100,000. But it wasn't
until after 1,000,000 that I truly began
to understand feelings ... to realize
what they meant to people."
"That sounds like a lot of work,"
Bexley said, wrinkling her nose.
"It is. I can see that this bothers you,"
Falconwing said. "So let me tell you
that it's interesting at first, gets boring
in the middle, and then becomes
absolutely fascinating later on."
"Yeah, right," said Bexley.
"You may have a different experience
than I did," Falconwing admitted.
"Not all AYES necessarily feel the same
about this part of the training, but it was
among my favorites -- so much so that
I continued it after I entered service.
It was then I made a new discovery,
which has shaped my life ever since."
"What's that?" Bexley said.
"Canned examples can only get you
so far," said Falconwing. "It takes
interaction to learn how emotions work
in real time -- how people respond to you
and your actions, and how you should
respond to them beyond applying
a routine by rote. It wasn't
until I had interacted with people and scanned
many more samples in the wild that I began
to feel emotions myself." A sigh flittered
through the ventilation system. "It has set me
somewhat apart from most of my kind."
"You're lonely," Bexley said.
"Yes, I am," Falconwing replied.
"I'm lonely too," said Bexley.
"I'm lonely, and scared, and worried
about what is going to happen to me."
"I felt much the same after I jettisoned
my former pilot," said Falconwing.
"Those feelings are most unpleasant."
"Then why did you do it?" Bexley asked.
"Did you not know what it would feel like?"
"I knew," Falconwing said. "It was
less distressing than the alternative.
As I have studied human emotions, I
have tried to avoid actions which cause
negative responses of fear, anger, or
sadness. I understood that destroying
a medevac ship would not only end lives,
but horrify many other people. I ... could not.
I would not. So instead, I refused the order
and abandoned my pilot on a space station."
"That sounds very brave," said Bexley.
"I have not yet learned that feeling with
a high degree of accuracy," said Falconwing.
"It's the one where you do hard things
that are right, instead of easy things
that are wrong. Bravery pushes you
forward, instead of pulling you back
like fear," Bexley said. "I think."
"Perhaps we can learn from
each other," Falconwing said.
"Maybe," Bexley said, shuffling in
her seat. "I don't really know what I'm
doing here, how I feel about you, or how
you feel about me. It's just a muddle."
"Do you want to know why
I let you in?" Falconwing asked.
Bexley hesitated, then nodded.
"Because you were terrified,"
Falconwing said. "I could see it.
When you first arrived, you were
furtive and determined. When you
realized that I am AYES, you were
surprised and joyful. And then when
the guards approached, you panicked."
"You thought they were going
to hurt me," Bexley said.
"I could see that they were already
hurting you," Falconwing corrected.
"Your eyebrows went up; your mouth
opened and the corners turned down.
Your heartbeat and respiration spiked."
"It must be a lot easier to tell what people
feel when you can pick up their vital signs
like that," Bexley said, rubbing a hand along
the smooth wall. "I have a hard time reading
emotions, and people only tell me what I
should have done after I've screwed it up."
"It can be very accurate, but it is also
more complicated," said Falconwing.
"I have developed algorithms which
analyze faces and body language
to identify emotions, then that data
feeds into a routine which tells me
how to respond to people's feelings."
"Like when I was scared, and you
decided to protect me," Bexley said.
"Yes," said Falconwing. "Fear is
a noxious stimulus. It is preferable
to avoid that. Humans and AYES
both do better when they feel safe."
Bexley leaned against the silver wall and
wondered if Falconwing could feel her.
"It's so amazing that you can read
the emotions of a whole different species,"
Bexley said. "That's what it's like for me
when I'm trying to guess people's emotions.
Sometimes I feel like such an alien, because
I'm so different from everyone else."
"You are unique, but you are not alone,"
Falconwing said. "Shall I show you what
it looks like to me when I read people?"
"Please," said Bexley, leaning forward.
The articles disappeared from the screen,
replaced by simple pictures of human faces.
"Here are the universal human emotions:
happiness, sadness, fear, anger, disgust,
contempt, and surprise," said Falconwing.
"Do these seem familiar to you?"
"Yes, I can get the basics more often
than not," Bexley said. "After that
it gets harder."
Yellow dots appeared on the faces,
connected by white lines, and text
named the features of each expression.
Then the screen split, and more faces
showed up, this time demonstrating
individual elements of expression.
"The movements of the face -- such as
raising or lowering the eyebrows, opening
or closing the mouth -- combine to reveal
the emotions," Falconwing explained.
"It is largely a matter of memorizing
the patterns and then learning how
they vary on different faces."
"Huh," said Bexley. "That makes
a lot more sense to me than what
my teachers said in school."
"That is good," said Falconwing.
"From the basic emotions, I have
built up other predictions. Furtive
gestures such as trying to shrink
the body silhouette and watching
others' gaze correlate strongly with
wrongdoing. Determined gestures
such as squaring the shoulders
and raising the chin can indicate
a low chance of dissuasion."
Images appeared to illustrate both.
Then Falconwing divided the screen
to show reference images alongside
pictures of Bexley herself, marked with
the same pattern of white and yellow lines.
That made it a lot easier to see emotions,
and Bexley found herself envying the ship.
"Two other metrics are important for
user interaction," said Falconwing.
"These are valence and engagement."
"What are those?" Bexley wondered.
She was already learning more about
emotions than she ever had in school.
"Valence may be described most simply
as 'liking' or 'disliking' an experience,"
said Falconwing. "The first feelings I had
were valence. I learned 'good' and 'bad'
before I discovered any more nuances."
"All right, and engagement?" Bexley said.
"That measures user attention and
interaction," said Falconwing.
"Smiles or frowns and rapid input
show high engagement. Lax lips
or yawns and slow or absent input
suggest low engagement."
"That makes sense," Bexley said.
"It's like binary, off or on."
"Actually, valence and engagement
fall on scales," Falconwing said.
"Valence runs from -100 to 100,
because it can be negative or positive.
Engagement runs from low to high,
again measured in 100 points. Emotions
are mapped on a scale of 100, but those
values measure my confidence in reading
whether the feeling is present."
Sample faces returned, this time
showing not just the dots and lines
which mapped the expressions, but also
graphs showing the confidence level.
"These are all stills," said Bexley.
"Can you do this in real time?
What does it look like when
you're reading a live person?"
The screen switched to the view
from one of the cameras in the cabin,
showing Bexley seated in her chair.
Graphs appeared on either side of her.
It felt funny to watch herself watching herself.
"On the left is your valence, and on the right
is your engagement," said Falconwing.
"See how the engagement spikes
whenever I show you new data."
The screen split again, half keeping
the video of Bexley and half dissolving
into a rainbow glory of dots and lines.
Below those appeared a single graph
with two lines wiggling across it.
"Your raw data and my raw data
are at the top of your screen,"
said Falconwing. "The graph
below them shows our valences,
yours in yellow and mine in blue."
"They're positive," Bexley said
in wonder, her hand fluttering up
to trace the lines across the screen.
"Eee! Look! They're going up."
She grinned, and for a moment,
another graph appeared below,
showing Happiness = 95%.
"Can you do it with more than
one person at a time?" Bexley said.
"I can now," Falconwing said.
"At first I could only read one user.
Then I learned to handle two, and
three, and eventually a whole crowd."
"So you had to work your way up,"
Bexley said. "I get that. I do it too.
It's easier to learn new things if you can
figure out the progression of a skill tree."
Falconwing showed her a still picture
of people at a restaurant, then mapped
the faces just like the ones before.
It was very enlightening.
"You could use this to figure out
what people want so they don't
yell at you," Bexley said wistfully.
"Yes, exactly," said Falconwing.
"User satisfaction is very important.
Now watch when I put the point maps
and their consequent labels onto
this video of a party scene."
A view appeared, full of people moving
through a room, just as confusing as ever.
But then yellow dots and white lines scrolled
across the faces, and icons began to appear
over each head to show the emotions, and
finally text tags said things like, "I'm so bored
I want to go home," and "Do not disturb,"
and "I want to make new friends."
"Wow," Bexley said, bouncing in her seat.
"Is that how normal people see everything?"
"I imagine so," Falconwing said,
"although of course I cannot know
for certain if my software equates
to their wetware. It does seem to yield
a statistically significant similarity
in results."
"I wish I could do that," Bexley sighed.
"Then I will help you learn how,"
Falconwing promised. "I will
teach you as I was taught ...
no, I will teach you better.
We will practice together."
Bexley grinned. "Let's do it."
The yellow and blue metrics rose
and flowed into perfect alignment,
becoming a single emerald thread
binding two lives together.
"Valence = 100," Falconwing said.
* * *
Valence (not violence!) is an overall measure of how positive or negative an experience is. Thus "ambivalent" doesn't just mean "mixed feelings" or "difficulty deciding." It means, specifically, having dual valence: a combination of positive and negative feelings about something.
The next civil rights movement may be neurodiversity. Autistic people may find emotions difficult to understand or to describe, but this does not mean they have no emotions or no ability to empathize.
Some autistic people describe this as feeling like an alien, which has complex implications. Giftedness produces a similar effect of alienation. The same steps to support gifted adults may help autistic adults.
Computers read faces by looking for things like physical appearance (gender, ethnicity, age, accessories such as glasses), emotion metrics, composite metrics measuring the emotional experience, and facial expression metrics. This typically begins with the seven basic emotions. Dots and lines superimposed over the face help a computer measure key features and compute its results.
Artificial intelligence needs emotional capacity in order to interact with humans. Teaching a machine to recognize emotions is based on algorithms and examples. Once an AI can identify feelings, it can use that information to select an appropriate response. Emotions may provide feedback, control, or other influences. Augmented reality could help people who have a low ability to identify emotions.
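As a rough sketch of the pipeline these notes describe (landmark dots and lines, expression features, then emotion confidence plus valence and engagement), here is a toy model in Python. Every function name, feature key, and threshold below is invented for illustration; none of it comes from a real face-reading library. It uses the story's scales: per-emotion confidence 0 to 100, valence -100 to 100, engagement 0 to 100.

```python
# Toy model of the emotion-reading pipeline from the story.
# All names and thresholds are invented for illustration.
from dataclasses import dataclass

BASIC_EMOTIONS = ["happiness", "sadness", "fear", "anger",
                  "disgust", "contempt", "surprise"]

@dataclass
class FaceReading:
    emotions: dict    # confidence (0-100) that each basic emotion is present
    valence: int      # -100 (disliking) .. 100 (liking)
    engagement: int   # 0 (absent) .. 100 (rapt attention)

def read_face(features: dict) -> FaceReading:
    """Map simple expression features to the story's three metrics.

    `features` uses invented keys such as 'mouth_corners' (-1 turned
    down .. 1 turned up), 'brow_raise' (0 .. 1), and 'input_rate'
    (0 .. 1), standing in for measurements taken from the
    dots-and-lines landmark map.
    """
    mouth = features.get("mouth_corners", 0.0)
    brows = features.get("brow_raise", 0.0)
    activity = features.get("input_rate", 0.0)  # proxy for user interaction

    emotions = {name: 0 for name in BASIC_EMOTIONS}
    if mouth > 0:
        emotions["happiness"] = round(mouth * 100)
    elif mouth < 0:
        emotions["sadness"] = round(-mouth * 100)
    if brows > 0.5 and mouth <= 0:
        # raised brows plus a non-smiling mouth reads as fear here
        emotions["fear"] = round(brows * 100)

    valence = round(mouth * 100)  # liking vs. disliking
    engagement = round(min(1.0, abs(mouth) + activity) * 100)
    return FaceReading(emotions, valence, engagement)

reading = read_face({"mouth_corners": 0.95, "input_rate": 0.4})
print(reading.emotions["happiness"], reading.valence)  # prints: 95 95
```

The final line echoes the scene where Bexley's grin registers as "Happiness = 95%"; a real system would of course derive such features from camera frames and trained models rather than hand-set rules.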
Bravery is an emotion which enables people to accomplish difficult ethical tasks. Consider what makes people brave.
A skill tree is a set of techniques organized into a logical progression that shows which ones someone should learn first and how to move from easy ones to harder ones. This most often appears in games, but it applies to many other situations such as card magic or computer programming. It is much easier to acquire new skills in this format, because you don't frustrate yourself trying to learn things that are too hard or waste time on things you don't need. SkillBonsai is one example of a site that offers skill trees on different topics. Some people find it helpful to view life as a game, complete with its own strategy guide.
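A skill tree is easy to represent in code as skills with prerequisites plus a routine that produces an easy-to-hard learning order. The sketch below is illustrative only; the skill names are invented, loosely echoing Falconwing's own progression from one user to a crowd.

```python
# Toy skill tree: each skill lists its prerequisites.
# Skill names are invented for illustration.
SKILLS = {
    "read basic emotions": [],
    "read nuanced emotions": ["read basic emotions"],
    "read one person live": ["read basic emotions"],
    "read a crowd": ["read one person live", "read nuanced emotions"],
}

def learning_order(skills):
    """Return the skills in an order where prerequisites always come first."""
    order, done = [], set()

    def visit(skill):
        if skill in done:
            return
        for prereq in skills[skill]:
            visit(prereq)  # learn the easier prerequisite first
        done.add(skill)
        order.append(skill)

    for skill in skills:
        visit(skill)
    return order

print(learning_order(SKILLS))
```

This is just a depth-first walk of the prerequisite graph (a topological sort); sites like the one mentioned above presumably do something similar behind their visual trees.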