
The Thing Growing - Intelligence for the Thing

Classes I think I need:

tgBrain
 

[image: decision diagram]
The brain decides what action the thing will take next.

Sometimes the thing will do a string of actions - e.g. emerge from the box, do a freedom dance, cavort around the user thanking him/her for releasing it.

Sometimes the thing will act, then analyse the user's actions to determine what to do next - e.g. it is going to teach the user a dance, so it must pause after each bit of the dance is taught to watch the user and see if they are doing the dance. Sometimes it will dance together with the user.

Each action (dance, pause) of the thing is a discrete event, the duration of which can vary. A clock is started at the beginning of each event and the user is checked during each event. In general the user is checked to see if s/he is attentive (i.e. watching the thing) and to see if s/he is moving (how fast, away from the thing, towards the thing). Additionally the user may need checking to see if s/he is trying to do a specific step of the dance.

In this teachdance sequence, the thing will hum/sing as it dances. Counting into the dance and establishing a tune and rhythm to set the movements to should make it easier to monitor the user more exactly, since it will be possible to judge when they are starting and finishing (I hope).

The brain's app will have 4 stages:
 

  1. get information - from the storystate, the thing's emotional state and the user info
  2. make decision - based on the info and on its own state (e.g. where it is in the teachdance sequence)
  3. order action - movement of the entire thing, movement of the thing's body parts, sound, specific user check
  4. start event clock - tick tock
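
As a very rough sketch of how these stages might hang together (every name and signature here is invented, not the real code):

    #include <string>

    // Hypothetical stand-ins; tgAction is fleshed out further down this page.
    struct tgInfo   { int storyState; float eweight; bool userAttentive; };
    struct tgAction { std::string name; float eweight; float duration; };

    class tgBrain {
    public:
        // One pass through the four stages, called every frame.
        void app(float now) {
            if (now < eventEnd) return;              // current action still running
            tgInfo info     = getInformation();      // 1. story, emotion, user info
            tgAction action = makeDecision(info);    // 2. pick the next action
            orderAction(action);                     // 3. moves, sound, user check
            eventEnd = now + action.duration;        // 4. start the event clock
        }
    private:
        float eventEnd = 0;
        tgInfo   getInformation()             { return tgInfo(); }   // see below
        tgAction makeDecision(const tgInfo&)  { return tgAction(); }
        void     orderAction(const tgAction&) {}
    };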
In the get information stage the brain checks:
 
The Storystate 
(tgStoryState)
This is where we are in the story. The story state will keep a timer that tells how long we have been in a particular section of the story. If we need to bring about a climax, the weight of the story state factor on the decision process will increase.
Emotional State 
(tgEmotions)
This is the thing's internal emotional state. It should in part depend on the previous states the thing has been in, and in part be subject to random patterns and abrupt change.
User Info 
(tgCheckUser)
    Global User Info
    This Event's User Info
    Specific Check
Information from the get information stage really gives the emotional tone the next action should have - i.e. is the thing happy or mad.

Make Decision

The app has to decide on a next action that fits the narrative as well as the emotions of the thing.

It therefore has to have a sense of the kind of action it is currently engaged in.
I propose a series of states that will correspond to the various activities: e.g. EMERGEFROMBOX, TEACHDANCE, SULKANDHIDE, etc.
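
For instance (the exact set of states is still open):

    // One state per activity the thing can be engaged in.
    enum tgActivity {
        EMERGEFROMBOX,
        TEACHDANCE,
        SULKANDHIDE
        // etc. etc.
    };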

The makeDecision function has to decide which localBodyMove to pick.
Each localBodyMove will have an appropriate sound attached to it.
Each localBodyMove will have several variations with different emotional weights.

So in a chooseLocalAction function we will check the thing's internal state, pick an appropriate action and assign an appropriate emotional level to the action.
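
A guess at the shape of chooseLocalAction, leaning on the tgActionStore sketched further down this page (the signatures are invented):

    // Check the thing's internal state, pick an appropriate action and give
    // it an appropriate emotional level.
    tgAction chooseLocalAction(tgActionStore& store,
                               const std::string& activity, float eweight) {
        tgAction a = store.get(activity, eweight);  // nearest emotional variant
        a.eweight = eweight;
        return a;
    }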

***

*****

The makeDecision function also has to pick an appropriate Global Move for the thing.

The chooseGlobalMove function will also use the states:

****

Order Action
 
  1. The brain will have a GlobalBodyDCS child - it will pass the information on the GlobalMove to this child.
  2. GlobalBodyDCS will have LocalBodyDCS children for each of its body parts. The brain will send the appropriate move to each body partDCS.
  3. The LocalBodyDCS for the head will have a sound child. The brain will send the appropriate sound file to the sound child.
  4. The brain will also tell the checkUser class if it has to activate any specific check on the user (is the user dancing the part of the dance s/he should be), and tell it to check this event generally.
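
In code the fan-out might look something like this - a toy sketch, with stub classes standing in for the real DCS nodes (all names invented):

    #include <string>
    #include <vector>

    // Minimal stubs so the fan-out has something to call.
    struct tgSound         { void play(const std::string&) {} };
    struct tgLocalBodyDCS  { std::string partName; tgSound snd;
                             void setMove(const std::string&) {} };
    struct tgGlobalBodyDCS { std::vector<tgLocalBodyDCS> kids;
                             void setMove(const std::string&) {} };
    struct tgCheckUser     { void setSpecificCheck(const std::string&) {} };

    struct tgActionData {   // the bits of tgAction that orderAction needs
        std::string globalMove, soundFile, userCheck;
        std::string moveFor(const std::string& part) const { return part + "Moves"; }
    };

    // 1. global move, 2. per-part moves, 3. sound (child of the head),
    // 4. any specific user check for this event.
    void orderAction(tgGlobalBodyDCS& body, tgCheckUser& checker,
                     const tgActionData& a) {
        body.setMove(a.globalMove);
        for (tgLocalBodyDCS& part : body.kids) {
            part.setMove(a.moveFor(part.partName));
            if (part.partName == "head") part.snd.play(a.soundFile);
        }
        checker.setSpecificCheck(a.userCheck);
    }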
 
 StartEventClock
    Each action will have a duration.
    The event clock starts at the beginning of the event and runs for the duration.
    The checkUser class uses it to know when an event has begun and ended.
    The brain uses it to know when it can go through its app process again.
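
A minimal clock sketch - the interface follows the description above, the rest is guesswork:

    // Started by the brain at the beginning of each action; read by the
    // checkUser class and by the brain itself.
    class tgEventClock {
    public:
        void start(double now, double duration) { begin = now; end = now + duration; }
        bool running(double now) const { return now >= begin && now < end; }
        bool over(double now)    const { return now >= end; }
    private:
        double begin = 0, end = 0;
    };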
 
 


 tgActionStore

Will be a child of the brain.
It is a list of actions that the brain can choose from.

The eweight (emotional weight) is sent to the actionStore, which stores actions both in a sequence and as variations of the action with different emotional weights. So the actionStore has to access the appropriate action by its place in a sequence and its eweight, or, in the case of the reactions, simply by the eweight.
 
 

actionStore

    sequence    variations of different emotional weight
    emerge      emerge, danceforjoy, suckup
    dance       dance1, dance2, dance3, dance4
    observe     observe1, observe2, etc.
    react       react

    (each sequence is stored four times over, once per emotional weight)
 
actionStore has children which are the Action class.
I need the action store to check through its children. If a child is named emerge, it makes an emerge list and puts the child inside, and it puts any subsequent emerge children into that same list. When it finds a different name it makes another list and puts all the, e.g., danceforjoy children in that list. All children in a list will have an emotional weight, and they can be accessed according to that weight. I imagine a text file looking something like this:
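
Something like this, maybe - the layout and the numbers are pure guesswork:

    emerge       eweight 0
    emerge       eweight 90
    emerge       eweight 180
    danceforjoy  eweight 0
    danceforjoy  eweight 90
    suckup       eweight 0
    etc.

And a sketch of the grouping logic in C++, with a stub tgAction (nearest emotional weight wins):

    #include <cmath>
    #include <map>
    #include <string>
    #include <vector>

    struct tgAction { std::string name; float eweight; float duration; };

    class tgActionStore {
    public:
        // A child whose name has been seen before joins that list;
        // otherwise a new list is made for it.
        void addChild(const tgAction& a) { lists[a.name].push_back(a); }

        // Access a child by name and by the nearest emotional weight.
        const tgAction& get(const std::string& name, float eweight) const {
            const std::vector<tgAction>& v = lists.at(name);
            std::size_t best = 0;
            for (std::size_t i = 1; i < v.size(); ++i)
                if (std::fabs(v[i].eweight - eweight) <
                    std::fabs(v[best].eweight - eweight))
                    best = i;
            return v[best];
        }
    private:
        std::map<std::string, std::vector<tgAction>> lists;
    };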


tgAction

Stores the entire action, comprising:

  1. the movement for each body part
  2. the sound that goes with it
  3. the duration time for the movement and sound
  4. a transition time - indicating how much time the localBodyMover should allow between finishing the current move and starting the next one
  5. an emotional weight
  6. a flag telling us if it needs an appropriate userChecker to be implemented
The action class has corresponding class members; a path for each body part must also be made and stored in a data directory. The parse function fills the lists and class members with info from the text file and the data directory of paths - plus, I guess, doing something with the sound file ????? which has its volume with it. The text file with all of the additional attributes filled in is like this:
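Perhaps one entry per action, along these lines - the layout and all the filenames are invented, and the real file would have more body parts (the tail pieces):

    action      dance1
    head        data/dance1_head.path
    body        data/dance1_body.path
    rarm        data/dance1_rarm.path
    larm        data/dance1_larm.path
    sound       dance1.aiff  volume 0.8
    duration    6.5
    transition  0.5
    eweight     90
    usercheck   1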

All these class members have to be accessible by the brain, and info from them passed on where appropriate -
i.e. the movements are fed to the localBodyDCSs (headMovesList to headDCS, etc.),
and sound to the sound node ????
 
 tgGlobalBodyDCS
 
 will consist of a set of Global Moves for the body to do including:
    emerging from box
    running after user
    staying close to user
    jumping
    backing away from user
    swooping in on user
    running to hide under nearest rock
    etc

It must be aware of the user's position and head orientation.
It needs to be able to avoid the rocks.

The brain will tell it what action to take.
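
For the awareness and the rock avoidance, one step of a global move might reduce to something like this toy sketch (nothing to do with the real CAVE code):

    #include <cmath>
    #include <vector>

    struct Vec2 { float x, z; };

    // Head toward the user, but push away from any rock closer than
    // two units.
    Vec2 stepTowardUser(Vec2 thing, Vec2 user,
                        const std::vector<Vec2>& rocks, float speed, float dt) {
        Vec2 dir = { user.x - thing.x, user.z - thing.z };
        for (const Vec2& r : rocks) {
            float dx = thing.x - r.x, dz = thing.z - r.z;
            float d2 = dx * dx + dz * dz;
            if (d2 > 0.0f && d2 < 4.0f) { dir.x += dx / d2; dir.z += dz / d2; }
        }
        float len = std::sqrt(dir.x * dir.x + dir.z * dir.z);
        if (len > 0.0f) {
            thing.x += speed * dt * dir.x / len;
            thing.z += speed * dt * dir.z / len;
        }
        return thing;
    }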
 

 tgLocalBodyDCS

pretty much like the old keyframeDCS or avatarDCS

needs to take in a list of positions, orientations and scales and move through them
over the duration

when finished, needs to interpolate from the last position of the old list to the first position of the new list
using the transition time
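
Roughly like this (positions only; orientations and scales would go through the same machinery, though orientations really want a proper slerp):

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    struct Key { float x, y, z; };

    // Sample a list of at least two keyframes at time t in [0, duration],
    // linearly interpolating between neighbours.
    Key sample(const std::vector<Key>& keys, float t, float duration) {
        float u = (t / duration) * (keys.size() - 1);    // fractional index
        std::size_t i = std::min<std::size_t>((std::size_t)u, keys.size() - 2);
        float f = u - i;
        Key a = keys[i], b = keys[i + 1];
        return { a.x + f * (b.x - a.x),
                 a.y + f * (b.y - a.y),
                 a.z + f * (b.z - a.z) };
    }

The transition is then the same interpolation run once more, from the last key of the old list to the first key of the new one, over the transition time.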
 

 tgHeadFlasher

needs to know when the thing is talking

flashes an object that lights up the head, and which therefore has to be loaded under the localBodyDCS along with the head

a good extension would be for this head flasher to change colors with the thing's emotions
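
A toy version of the flasher - the timing and the colour mapping are invented:

    #include <cmath>

    struct Color { float r, g, b; };

    // Blink while the thing is talking; tint the flash by the emotion wheel
    // angle (0-360), redder toward the angry end of the wheel.
    Color flashColor(bool talking, double now, float eweight) {
        if (!talking || std::fmod(now, 0.25) >= 0.125)   // off half the time
            return { 0, 0, 0 };
        float h = eweight / 360.0f;
        return { 1.0f - h, h, 0.2f };                    // crude hue ramp
    }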

tgCheckUser

keeps information about the user and interprets it!

Global Info

This Event Info

Specific Checks
        each specific movement (i.e. of the dance) will need a specific check
        needs to have a list of specific checks that correspond to movements
        needs to get info from the brain on when to use what check
        needs to return comply/fail info to the brain
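
The specific-check list might be no more than a name-to-check map - a sketch, all names invented:

    #include <functional>
    #include <map>
    #include <string>

    class tgCheckUser {
    public:
        typedef std::function<bool()> Check;   // true = comply, false = fail

        void addCheck(const std::string& name, Check c) { checks[name] = c; }
        void activate(const std::string& name)          { active = name; }

        // Called at the end of the event; the comply/fail goes to the brain.
        bool result() const {
            auto it = checks.find(active);
            return it != checks.end() && it->second();
        }
    private:
        std::map<std::string, Check> checks;
        std::string active;
    };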

tgEmotions

 [image: emotion wheel]
 Not very worked out!

I need the thing to flip from emotion to emotion and I'm thinking an emotion wheel  may work.

Actions are given an eweight between 0 and 360.
At 0 the thing goes from manic/high to angry.
At 90 it softens from manic to happy.

I'm thinking that it would be good to visualize this for testing purposes - so that at any time we can see with a pointer where on the wheel it is.

What should move it?
        It's affected by, say, its last x moods (event by event)
        It's affected by the user's attitude
            too much user compliance makes it angry
            too little user attention makes it angry
            an impatient user makes it manic
            a tentative user makes it blue (depressed)

Not really sure how complicated it needs to be to convince us of the possibility of its mood swings.
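
A first stab at the update, just to show the shape of it (all the constants are pulled out of the air):

    #include <cmath>
    #include <cstdlib>

    // Nudge the wheel angle toward the average of the last few moods, react
    // to the user's attitude, and occasionally jolt it for an abrupt change.
    // (Impatient -> manic and tentative -> blue would slot in the same way.)
    float updateEmotion(float angle, float lastMoodsAvg,
                        float compliance, float attention) {
        angle += 0.25f * (lastMoodsAvg - angle);     // pulled by recent moods
        if (compliance > 0.9f) angle -= 10.0f;       // too much compliance -> angry
        if (attention  < 0.2f) angle -= 10.0f;       // too little attention -> angry
        if (std::rand() % 100 < 5) angle += 120.0f;  // rare abrupt flip
        angle = std::fmod(angle, 360.0f);            // keep it on the wheel
        if (angle < 0.0f) angle += 360.0f;
        return angle;
    }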
 

tgStoryState

I'm not exactly sure if this needs to be a class of its own.

All it's going to do is keep global track of the state the story is in, plus a timer that tells how long we have been in that state.

It will also have durations for how long the state can last and will move the action onto the next state if things are lagging.

states are:
    OnThePlain
    InTheShed
    UserMeetsThing
    TeachDance
    PleasureStick
    TeachDance2
    ThingTantrum
    RocChase
    ThingReleaseUser
    TeachDance3
    WorldCrack
    etc.

I think the user will revisit the teachdance state several times, so I make a note here that we also need to keep track of where the user has gotten to with the dancing. The Thing will pick up the teaching from where it left off.
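
If it does end up as a class, it needn't be much more than this (a sketch; the durations would be tuned per state):

    // Track the current story state, how long we have been in it, and move
    // the action on if things are lagging. Also remembers the dance progress
    // so the teaching can pick up where it left off.
    enum tgStory { OnThePlain, InTheShed, UserMeetsThing, TeachDance /* etc. */ };

    class tgStoryState {
    public:
        void update(double now, double maxDuration) {
            if (now - entered > maxDuration) advance(now);
        }
        void advance(double now) { state = tgStory(int(state) + 1); entered = now; }
        tgStory current() const          { return state; }
        double  timeIn(double now) const { return now - entered; }

        int danceProgress = 0;   // how far the user has gotten with the dance
    private:
        tgStory state = OnThePlain;
        double  entered = 0;
    };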
 

tgRecorder

This is to record the motion-tracked movement of the thing.

I need to be able to record four elements (head, body, two arms)

Then play those back and record four more (four tail pieces)

I need to work out a naming convention and a filing convention so I end up with

move_head.path
move_body.path
move_rarm.path
etc
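
So the convention could be as simple as this helper (just a guess):

    #include <string>

    // Build the file name for one recorded element: <move>_<part>.path
    std::string pathFile(const std::string& move, const std::string& part) {
        return move + "_" + part + ".path";   // pathFile("move", "head") -> "move_head.path"
    }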