Welcome!

After 20 years at the University of Oregon, I have retired. So, I will begin posting about my new experiences here and hope you find them interesting. Note to spammers: all comments on this blog are moderated. If you attempt to leave a comment with links, it will be deleted! So please, don't waste your time or mine!

Monday, December 12, 2016

A Smart Home For Christmas

by Mary Harrsch © 2016

Back in 1995, Microsoft introduced an interactive help utility for Windows 95 called "Bob".  I was probably one of the few professional technology people who actually used "Bob" (as a dog lover, I selected the helper incarnation called "Power Pup").  "Power Pup" would keep track of my keystrokes as I worked in different applications and offer procedural advice on what it perceived I was trying to do at the time, prefaced by a little bark and a wag of his tail.  I found "Power Pup's" suggestions often useful and his friendly interaction a welcome break from the stress of administering a college-wide multiplatform local area network.  But, apparently, many of my colleagues thought he was "too cute" for the serious work of computing, and "Bob" was relegated to the dustbin of failed products in fairly short order.

But I did not forget "Bob" and how artificial intelligence could be used to improve productivity while reducing social isolation.  So I began to experiment with conversational agents that utilized natural language processing coupled with knowledge bases to provide a friendlier computer-to-human interface.  With my interest in history, I decided to try to virtually recreate historical figures from the past that could converse about their culture with interested modern visitors.  This resulted in the creation of a virtual Julius Caesar that was online for several years before I retired.  Caesar would answer questions about ancient Roman culture posed to him by visitors entering their questions in a text box.  He could give a textual answer or display related websites or online videos.  As text-to-speech technology advanced, I even experimented with software that would enable Caesar to answer questions verbally and explored voice recognition technology to see if it was viable for user input as well. But, when I retired I no longer had access to the server platforms needed to support projects like Caesar.  However, my interest in natural language processing and friendlier human-to-computer interfaces endured.

So, I followed the development of Amazon's Alexa-powered devices with a great deal of interest. But, I'm a rather pragmatic individual and at first so much marketing emphasis was placed on Echo's music management features that I wondered if there were more useful applications for a busy 21st-century household.

Then I began reading about wifi-enabled electrical connection accessories that could be managed with Alexa-enabled devices and thought about how convenient it would be to turn groups of lights and appliances on and off with a few words rather than going around physically flipping switches. But at almost $200 per device, and with the need to have a device in each main room, it was still an expensive proposition to gain a little convenience.

Then Amazon introduced the Echo Dot coupled with a holiday sale price of just $39.95 and I found resistance was futile as my Star Trek friends would say.  I was still a bit concerned about the accuracy of the voice recognition, though. So I started out with just one Echo Dot for the living room along with a couple of Wemo wifi-enabled plug adapters for the two main living room lamps.

I downloaded the Alexa app to my iPhone and discovered the Echo Dot setup was a breeze.  I opened the Alexa app, then opened Settings and changed my iPhone wifi connection to the detected Echo network and configured the detected Dot. Alexa also did not seem to have any problem understanding me.  I read through all of the "Try this" examples and began to configure some of the built-in features.

I really liked the "flash briefing" feature that lets you select specific news feeds for a personal news update which you can request at any time.  I selected NPR radio, BBC News, TechCrunch and CNET (for technology news) and Discovery (for science news) as my personal news sources.  I also added the local weather forecast and the Alexa Try This feed.  Although I live in Oregon's Willamette Valley, I couldn't find any local news feeds, but I think I'll add feeds from Seattle and San Francisco to at least hear major stories from the Pacific Northwest.

I also read that Alexa could interface with Google Calendar and keep you apprised of upcoming appointments.  I hadn't used an online calendar since I retired but knew how helpful this would be, especially when managing complex medication schedules and medical appointments.  So I configured my Google Calendar and paired it with Alexa.  Now, each morning after requesting my Flash Briefing, I ask what's on my schedule for the day and Alexa tells me.

I've also used my Google Calendar to keep track of upcoming programming on PBS that I may wish to record.  PBS sends me a physical schedule of their upcoming programming for a full month, but at present my DISH Hopper cannot see more than two weeks of scheduled programming at a time. Now, when I get my PBS schedule, I enter the programs I wish to record into my Google Calendar and Alexa lets me know each day if any are on that day.  I can then set my DVR to record them.

But the real "killer" app I was looking for turned out to be Alexa's Shopping List!  It never seems to fail that I realize I need something from the grocery store when I'm not in the kitchen where I keep my shopping list.  As I've gotten older, my short-term memory is not what it used to be either, and it is not uncommon for me to forget what I was thinking about just a few minutes later as I walk from one room to the next.  So, imagine how helpful it is to be able to tell Alexa to add something to your shopping list as soon as you think of it, regardless of where you are! Of course, that meant I needed to add Echo Dots in my bedroom and the kitchen, too, which I promptly did. To access my shopping list once at the grocery store, I just open the Alexa app on my iPhone and check off and delete each item as I add it to my physical shopping cart.

Although my husband has the television blaring all day long, I did find a nice use of the music management features of Alexa.  Now that I have an Echo Dot in the bedroom, I can run a warm bath in the adjoining bathroom (Alexa's range is up to 20 feet), lie back in the tub and tell Alexa to play one of my favorite playlists from my Amazon Music account.  I did have a few hiccups configuring my Amazon playlists to work with Alexa, though.

I had already imported most of my music from my iTunes library to my Amazon Music account.  I had also set up playlists previously.  But Alexa did not seem to recognize my playlist names and would offer something from Prime Music (since I'm a Prime member) using my spoken words as a search guide. I ended up calling Alexa tech support and learned that Alexa does a better job of recognizing playlists if you name them "Your Name" then "Description".  For example, I had a playlist named "Holiday favorites".  I renamed it to "Mary's Holiday Favorites" then Alexa recognized it and played it for me.  That solved most of my playlist issues.  There were a few words, however, Alexa seemed to insist on using for search terms.  So, I experimented with different descriptions until she properly recognized the list.  I had a list named "Sentimentals".  I initially renamed the list to "Mary's Sentimentals" but Alexa still loaded some other Prime Music.  I renamed it again to "Mary's Mood Music".  Alexa still did not interpret it correctly.  So I finally renamed it to "Mary's Soft Rock" and Alexa now recognizes it.

When I received my Wemo wifi-enabled plug adapters for my living room lights, I realized just how powerful having a "Smart Home" would be.  Our living room does not have any overhead lighting so all lighting is provided by individually controlled lamps.  Each night I have to go around and turn each lamp on or off.  But, by connecting them with my Echo Dot, I now simply say "Alexa, Living Room On" and the entire living room lights up.

I had to first download and install the Wemo app onto my iPhone. Then I opened the Wemo app on the iPhone, changed my iPhone wifi connection in Settings to the detected Wemo network and configured each plug adapter.  Then I opened the Alexa app, selected the Smart Home option and grouped the two detected Wemo plugs into a "Living Room" group.

I hope to eventually replace some of my wall switches with wifi-enabled switches too since I have porch lights on different circuits in different parts of the house.  I would like to tell Alexa to turn on the porch lights and have all of them on at once without traipsing from room to room whenever I need to go outside after dark or have visitors arrive after dark.  I did read about a gotcha, though. I learned that many wifi-enabled switches require a neutral wire that was not normally included in wiring installed before 2011.  However, I have researched this issue further and it looks like there are switches out there that do not use the neutral wire.  I just have to be sure they will work with our home's wiring configuration.

I recently learned about a new app for Alexa called "Ask My Buddy" too.  It enables you to send a text, email or phone call to up to five family members or friends if you need to alert them about a problem, such as being incapacitated and unable to reach a phone.  It's sort of like "Life Alert" without the automatic 911 call or monthly subscription fee.  I wish it would allow you to send an individually specified text message, but it only sends a message saying you need help.

Update 1/1/2017: I discovered another Alexa skill named "SMS with Molly".  It lets you send a short text message to someone in your preconfigured contact list by saying "Alexa, ask SMS with Molly to text Margaret 'I'm home safely'".  You need to sign up for a free account with SMS with Molly, add your contacts to your contact list, then enable the skill using your Alexa App.  I think this app would be really helpful for seniors living alone who wish to let their family members know they are OK each day.

I've also decided to try the timer feature and see if I can get Alexa to verbally remind me to take my medicine at noon.  Most of my medications are taken in the morning or at bedtime and I have no problem remembering them as they are part of my morning and bedtime routines.  But when I get busy preparing lunch, I sometimes forget to set my noon medication by my water glass so that I take it with my meal.

Update 1/1/2017: After reading up on Alexa's timer and alarm functions I learned that timers are designed for one-time use while alarms can be set to be repeating.  So I set an alarm for noon each day and selected a pleasing alarm tone.  I wish it would let you specify a short text string that Alexa could read to you using her text-to-speech capability but at present it doesn't.  For my present needs, a tone is okay as there is only one thing for me to remember at noon.  However, for someone with more complicated medication schedules, it would be really helpful to have Alexa sound a tone followed by a short reminder message.  Hopefully, Amazon's engineers will enhance the alarm function soon.

So, my Echo Dots with Alexa are now very much an integral part of my day.  When I get up in the morning I say "Alexa, Living Room On" and the lights go on in the living room.  I walk in and sit down and say "Alexa, my Flash Briefing please".  I then listen to the news and get the latest weather forecast for the day.  Then I say "Alexa, what's on my schedule today?" and she tells me whatever I have scheduled in my Google Calendar.  As I prepare a meal and notice I'm getting low on coffee, I say "Alexa, add Coffee to my Shopping List" and she tells me she has added coffee to my shopping list.  I drive to the grocery store, open the Alexa app on my iPhone, select Shopping List from the menu, and load my cart.  At noon while I am preparing lunch, Alexa sounds a tone to remind me to set out my noon medication. In the evening, I take a warm bath to relax and tell Alexa to play "Mary's Soundtracks" and listen to my favorite movie music while I'm soaking.  Then when I'm ready for bed I say "Alexa, Living Room Off" and Alexa turns off the living room lights. I'm sure I'll find other useful applications, too, as more "skills" are developed by Amazon and third parties as well. I have a feeling this is just the beginning!

Thursday, October 6, 2016

Lifelong Trekker celebrates 50 years at Chicago Trek Fest

by Mary Harrsch © 2016

My son Ben (left) and I (right) share a moment with
actor Sean Kenney (Center) who portrayed a disfigured
Captain Christopher Pike in the classic Star Trek
episode "The Menagerie".
Since this year marks the 50th anniversary of the premiere of the original Star Trek television show, I traveled to Chicago to attend a 50th anniversary Star Trek celebration (after I couldn't get a ticket to the Las Vegas Star Trek gathering even though I tried to obtain one nine months before the event!).  After 50 years, I finally got to meet William Shatner, the original Captain James T. Kirk himself (guaranteed by the purchase of a silver ticket, of course!).

I have been a Trekker since the very first show.  In fact, some of my high school friends got really angry with me because I belonged to the Pep Club, Junior Varsity games were played on Thursday nights, and I was expected to attend.  But when Star Trek was announced, I stayed home to watch "The Man Trap", the very first Star Trek episode broadcast in September 1966, and never attended another Junior Varsity game after that.  I was hooked and even made plans to major in biochemistry at the University of Chicago so I could work at NASA's Ames Research Center and search for extra-terrestrial life. But, life dictated another course and I wound up as an educational technologist instead.  At least I worked with computers like those depicted in Star Trek, which were pure science fiction during its broadcast run.

I watched the first two seasons, then got married and didn't see the episodes of Season 3 until Star Trek went into syndication.  When my son was born, I would rock him to sleep while watching Star Trek episodes aired in the afternoons on the local TV station.  As it turns out, Ben, who now lives outside of Chicago, actually went with me to this Star Trek convention.  My love of the show must have rubbed off on him!

I was really excited when Star Trek: The Next Generation (STNG) was broadcast in the 80s, followed by Deep Space Nine, Voyager and finally Enterprise although family responsibilities often interfered with my ability to watch these subsequent shows. I always took a day off from work to attend the opening of each Star Trek movie, though.

I also explored "Star Trek: Federation Science" at the Oregon Museum of Science and Industry (a traveling exhibit ending in 2002) where I got to beam down to a planet as a member of an away team (My husband said he couldn't believe we stood in line for 2 1/2 hours for that!).  I actually got to sit in the captain's chair on Captain Picard's bridge at the Hollywood Entertainment Museum (sadly closed in 2007).  Then I jumped at the chance to attend Comdex (a huge technology trade show in Las Vegas) held at the Hilton where "Star Trek: The Experience" was installed (closed in 2008) and got to be accosted by a garrulous Klingon in Quark's Bar!  The Klingons must have it in for me because I ran into a couple more at the U.S. Space and Rocket Center in Huntsville, Alabama, too!

Then in the 90s, I finally had a chance to attend my first Star Trek convention held right here at the Hilton Hotel in Eugene, Oregon!  Michael Dorn (Worf) was the featured guest and the Hilton was so packed the fire marshal was having a fit!  When Michael Dorn plays Worf he lowers his voice dramatically, and, as Worf, he recited the line from a recent STNG episode in which he professed his love for his half-Klingon wife, and the crowd went wild!

By the time I attended the next convention featuring George Takei (Sulu), I even fashioned a slightly modified version of the Star Fleet uniform featured in Star Trek II: The Wrath of Khan and wore it, although I was too embarrassed to enter the costume competition!

At these Creation-sponsored conventions, there was always something going on - screenings of music videos, bloopers, contests, auctions, etc. The vendor hall was packed and the auctions included some really high-end collectibles. It was a lot of fun and everyone enjoyed themselves even if you didn't spring for the gold or silver reserved seats.  Everyone got a chance to meet the keynote speaker.  You just might have to stand in line for quite a spell to do it.

Sadly, I was to discover those attributes have become a thing of the past. I found the Chicago convention poorly organized and intentionally engineered to limit attendance - totally baffling to me considering the size of Chicago.  Furthermore, Creation Entertainment sponsored two conventions just across the street from each other on the same days - one for "Supernatural", a currently broadcast show in its 11th season with a decidedly younger fan base, and the Star Trek 50th Anniversary tour.  Irritatingly, the Star Trek convention, with many older attendees, appears to have gotten short shrift.  Whoever was running the cameras did a terrible job, the microphones and sound systems were erratic, and many of the top celebrities like Jonathan Frakes (Commander Riker from STNG) and Brent Spiner (Commander Data from STNG) supposedly canceled at the last minute.

We were told Frakes had to finish directing the last episode of "The Librarians" for this season.  I'm sure that had to have been known for some time, but we were not told until the morning of preregistration, probably to prevent cancellations and demands for refunds.  Furthermore, although I no longer have access to the extensive list of celebrities purportedly attending the show that I saw prior to my ticket purchase almost nine months ago, I remember it being far longer than the roster of those who actually showed up.

At this convention there were fewer than a dozen vendors, although I managed to find an authentic Tribble complete with action sounds and some collectible Star Trek Hallmark ornaments I'd never even seen before. The auction only included signed photographs or display banners, except once when one of the volunteers was draped with various T-shirts (a tactic I remember from years ago). There were no blooper reels or music videos except for an amateurish video submitted by a fan. Apparently, on the convention website they had announced a fan music video contest but only had one taker.  I wish I had known about it.  I think I've learned enough about ProShow Gold that I could have put together something!

Autographs and photo ops were all "pay to play" activities charging such "modest fees" as $25 - $100 each depending on the celebrity plus the cost of whatever it is you are having signed.

Each guest appearance included a Q&A session, but there were "planted" questioners during the sessions, awfully similar to "reality" show productions.  Maybe Creation figured us old geezers were too old to notice.

There was a costume contest like in the old days, but most people wore the standard Star Fleet costumes you can purchase online.  There were several outstanding exceptions, though.  A young college girl dressed up as Commander Data in the 19th-century ship's officer uniform he wore in the opening holodeck scene of the feature film "Generations".  She had her face made up with white makeup and her hair pulled back and looked so much like Data it almost made up for Brent Spiner's absence!  If I had been able to find a 12" action figure of Data dressed that way I would have bought it and had her sign it!  There were two men dressed as Khan Noonien Singh, one from the Classic Trek episode "Space Seed" and the other from the feature film "Star Trek II: The Wrath of Khan".  There were several well-done Andorians and a couple of Klingons as well.  Two people even came as hortas, the silicon creatures from the Classic Trek episode "The Devil in the Dark"!

A young college woman (Center) dressed as Commander Data in the opening sequence of the Star Trek feature film "Generations".  Photo courtesy of  "The Chicagoist"
The Klingon Empire was actually pretty well represented.  Guest celebrities included Michael Dorn (Worf), the actors who portrayed Gowron and Martok from Deep Space Nine and Suzie Plakson who played K'Ehleyr, Worf's half-Klingon mate in the STNG episodes "The Emissary" and "Reunion".  Plakson also played a Vulcan doctor on the STNG episode "The Schizoid Man", a female Q on the Voyager episode "The Q and the Grey" and the Andorian Tarah on the "Enterprise" episode "Cease Fire".  Michael Dorn's presence sort of brought me full circle back to the very first Star Trek convention I ever attended.  I smiled thinking about that as this convention may be the last I will ever attend.

Other guests included Gates McFadden who played Dr. Beverly Crusher on STNG (being a dancer she looked fantastic by the way!), Marina Sirtis who played Counselor Troi on STNG and Robert Duncan McNeill who played Tom Paris on Voyager.  McNeill now wears glasses and I honestly didn't recognize him.  He said it was the "Clark Kent effect!"

Nana Visitor (Kira), René Auberjonois (Odo), Armin Shimerman (Ferengi Quark), Max Grodenchik (Ferengi Rom) and Jeffrey Combs (Andorian Shran) represented "Deep Space Nine".  Jeffrey Combs also played many different aliens on DS9, Voyager, and Enterprise.

Armin Shimerman, who played the Ferengi barkeep Quark on Star Trek: Deep
Space Nine, explains how a performance on the DS9 set had to be DLP
(Dead Letter Perfect).  Photo courtesy of "The Chicagoist"
I was particularly impressed with Armin Shimerman's presentation.  He now teaches Shakespeare and treated us to a little bit of Henry VI Part 2 although, to be honest, his depiction of a hunchback made me, like a lot of other people, think it was Richard III.  I was really surprised to learn from Mr. Shimerman that, although there were hijinks on the sets of Classic Trek and STNG, by the time Deep Space Nine was put into production, Paramount ran the production like a well-oiled automotive assembly line.  Each performance had to be DLP - Dead Letter Perfect!  There was no ad-libbing allowed.

René Auberjonois played Security Chief
Odo on "ST: Deep Space Nine" and
Paul Lewiston on "Boston Legal"
Image courtesy of Kyle Cassidy.
I understand from René Auberjonois's presentation this stringent adherence to the script even applied to scenes where the actor was to cough or clear their throat.  Apparently, actors who could not perform at this level of precision didn't last very long on a Star Trek set or on the set of Boston Legal either. That really floored me, as Boston Legal's storylines were often zany, but they were apparently very precisely zany!

Sean Kenney as Captain Christopher Pike in "The Menagerie".
Image courtesy of Wikimedia Commons
I was really glad to see Sean Kenney at this convention.  Sean played the disfigured Captain Christopher Pike in the Classic Trek episode "The Menagerie".  I found a Star Trek collectible Hallmark ornament in one of the vendor booths depicting him and he autographed it for me and had someone take our picture together with my camera.  So between Sean and William Shatner I technically got two captains' autographs!

An 85-years young William Shatner still with a twinkle in his eye!
Photo courtesy of  "The Chicagoist"
Shatner, of course, was the consummate showman and every bit the star of the show.  He talked about all of the projects he is currently involved in, including a very physically demanding upcoming equestrian competition in Las Vegas, then discussed the importance of friendship.  He said it was particularly hard for actors to make close friends because they are always bouncing from job to job.  I had never really given much thought to the "piecework" nature of acting before.  When it came time for him to sign my Hallmark ornament depicting him in the episode "The Trouble with Tribbles", one of his assistants pointed to the front edge of the ornament asking if this would be a good place for him to sign, and I asked "Are you sure he can sign in such a small space?" whereupon Shatner, with his eyes twinkling, told me "Why, I could sign on the head of a pin!"

Tuesday, May 17, 2016

Tricky rewards programs mislead shoppers!!

Last week I read an article about the best credit card rewards programs to use for maximum benefits. One card that the article listed as one of the best programs was the American Express Blue Cash Preferred Card because it pays 6% cash back on groceries.  It charges a $75 annual fee, but with the rewards for groceries so high I thought I would still come out way ahead by using that card.  So I applied and was approved for the card and received it last night in the mail.  This morning when I called to activate the card, I asked the account representative if Walmart, where I buy most of my groceries, was considered a grocery store.  "Oh, no," she replied, "Walmart is considered a superstore like Costco or Sam's Club so rewards would be 1% cash back. By grocery stores we mean Kroger or a chain like that."  I told her that those chain grocery stores charge as much as 50% more for groceries than Walmart, so I would lose money by shopping there as opposed to saving money, which was my original goal in acquiring the card.

For example, my husband and I like Grandma Sycamore's bread made by Sara Lee (sorry, no grandmother involved!).  I can purchase a loaf of that bread at Walmart for $2.50 or less.  At the local Albertson's (that would qualify for the 6% cash back) I would have to pay $3.79 for the identical loaf of bread.  That's 51.6% more for the same bread.  I'm afraid the 6% cash back wouldn't begin to cover the extra cost of shopping there.
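The arithmetic is worth spelling out. A quick back-of-the-envelope sketch, using the bread prices above and the card's advertised 6% grocery and 1% superstore tiers:

```python
def effective_cost(price, cash_back_rate):
    """Out-of-pocket cost after subtracting the card's cash back."""
    return price * (1 - cash_back_rate)

walmart = effective_cost(2.50, 0.01)      # Walmart loaf at the 1% superstore rate
albertsons = effective_cost(3.79, 0.06)   # same loaf at the 6% grocery rate

print(f"Walmart net:     ${walmart:.2f}")
print(f"Albertson's net: ${albertsons:.2f}")
print(f"Premium paid to 'earn' 6%: {albertsons / walmart - 1:.0%}")
```

Even after the larger reward, the chain-store loaf still costs roughly 44% more out of pocket, so the 6% rate never catches up.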

I asked the American Express customer service representative to cancel the card and she was very gracious and did so without any high-pressure tactics to keep the card.  After I hung up, I thought about the annual fee and wondered if I would be charged the fee even though I cancelled the card on my activation call.  So, I called back and got a nice young man who assured me the card was cancelled and I would not be billed for the annual fee.  I wish all customer service representatives were as nice as those that work for American Express.  It's just too bad that the card will not save anyone who shops for groceries at Walmart any more money than a typical cash back card without an annual fee.

So, I guess the moral of this story is you have to be particularly vigilant if you are trying to save money using a rewards program that specifies different cash back amounts for different categories of purchases.  Although the author of the article did mention the card would not pay 6% at Costco or Sam's Club, they did not mention Walmart, which many people consider a "regular" grocery store since no membership fees are charged.  At least the Citi Double Cash card pays 2% cash back for any purchase regardless of the source.

Saturday, January 16, 2016

Extending the Learning Environment: Virtual Professors in Education

A technology resource article by Mary Harrsch © 2005

For those of you interested in artificial intelligence development, here is an archive copy of a presentation I gave in 2005 (I'm consolidating my online contributions!)



Extending the Learning Environment: 
Virtual Professors in Education

By Mary Harrsch
Network & Management Information Systems
College of Education, University of Oregon
[2005]

Six years ago [1999], my sister was telling me about a fascinating History Alive Chautauqua event she had attended near Hutchinson, Kansas.  The program brings a reenactor portraying an historical figure into schools and communities for an educational presentation and question and answer session.  I thought to myself, “It’s too bad more people can’t take advantage of such a unique learning experience.”  Then, the technologist within me began to wonder if there was a way to create a virtual Chautauqua experience online.  As I pondered this possibility, I realized that if I could find software that could be used to create a “virtual” person online, I could not only recreate the experience of the Chautauqua, but provide a tool faculty could use to answer course-specific questions.  It could even be used to provide information about the professor’s personal interests and research to enhance the sense of community within the learning environment.

My quest led me to a website that included links to a number of different software agent projects.  I learned that the type of agent I needed was commonly called a “chatterbot”.  The first “chatterbot” was actually developed long before the personal computer.  In the early 1960s, Joseph Weizenbaum created “Eliza”, a virtual psychoanalyst.

In his efforts to create a natural language agent, Weizenbaum pointed out that he had to address the technical issues of:

  • the identification of key words,
  • the discovery of minimal context,
  • the generation of responses in the absence of keywords.

As I began to explore different agent implementations, I found that, in addition to these issues, the application needed to be able to prioritize keywords to discern the most appropriate response.  Several agents I evaluated, including Sylvie, a desktop assistant developed by Dr. Michael ("Fuzzy") Mauldin, Artificial Life's Web Guide, Carabot 500 developed by the U.K. company Colorzone, and Kiwilogic's Linguibot, used slightly different methods to set the priority of subject keywords to select the most appropriate responses.  The response with matching keywords under the subject with the highest level setting was "fired" – displayed to the user.  However, when editing their script files, I found keeping track of subject priorities was challenging.
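The priority scheme can be sketched in a few lines of Python. The subjects, priority levels, keywords, and responses here are all invented for illustration; they are not taken from any of the products named above:

```python
# A minimal sketch of subject-priority response selection:
# each entry pairs a subject's priority level with its required keywords,
# and the matching entry with the highest priority is "fired".
RESPONSES = [
    # (subject, priority, required keywords, response)
    ("email",     5, {"email", "password"}, "To change your e-mail password, ..."),
    ("accounts",  3, {"password"},          "Account passwords can be reset by ..."),
    ("smalltalk", 1, set(),                 "Tell me more about that."),
]

def best_response(question):
    words = set(question.lower().rstrip("?!.").split())
    # Keep entries whose required keywords all appear in the question,
    # then fire the one whose subject has the highest priority setting.
    candidates = [r for r in RESPONSES if r[2] <= words]
    return max(candidates, key=lambda r: r[1])[3]

print(best_response("How do I change my email password?"))
```

Here "How do I change my email password?" matches both the "email" and "accounts" subjects, but the higher priority of "email" decides which response fires, which is exactly the bookkeeping that became hard to track in large script files.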

Another problem with many script-driven agents I evaluated was the use of left-to-right parsing sequences that did not compensate for a variance in the order of keywords in a question. Each query had to be evaluated for subject and for matching character strings, based on left-to-right word order with the use of various “wildcard” characters to indicate placement of keywords within the overall question.  Therefore, you often had to have multiple script entries to compensate for different word order.  For example, if a student asks “How do I change my password in e-mail?” you would need one script entry. If the student asks “How do I change my e-mail password?” a different script entry would be required:

* email * * password * as well as
* password * * email * to trap for either wording.

Although this attention to script design resulted in improved response accuracy, the scripting knowledge required for these agents was not something I would expect a faculty member to have the time or desire to learn.
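That order-dependence is easy to demonstrate with Python's glob-style fnmatch, which parses left to right much like those wildcard script patterns (a sketch only, not any of the agents' actual matching engines):

```python
from fnmatch import fnmatchcase

q1 = "how do i change my email password?"
q2 = "how do i change my password in email?"

# A single left-to-right wildcard pattern only catches one word order...
print(fnmatchcase(q1, "*email*password*"))   # matches q1
print(fnmatchcase(q2, "*email*password*"))   # misses q2; a second entry is needed
print(fnmatchcase(q2, "*password*email*"))   # the second entry catches q2

# ...whereas a simple order-independent keyword check catches both wordings:
def has_keywords(question, keywords):
    return all(k in question for k in keywords)

print(has_keywords(q1, ("email", "password")))
print(has_keywords(q2, ("email", "password")))
```

With left-to-right patterns, every alternative word order needs its own script entry; an order-independent check needs only one.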

A third problem with several of the agent applications I used was the necessity to unload and reload the agent each time the script file was edited.  If students were actively querying the agent, you could not save any script changes until the script file was no longer in use.

When I invested in the Enterprise Edition of Artificial Life's WebGuide software, I also realized the importance of a logging feature that I could use to study and improve my guide's responses.  I also recognized how important it would be, in a virtual tutoring environment, for a student to be able to print out a transcript of their tutoring session for future study.  Not only was this feature absent in the agents I evaluated, but the responses produced using Javascript or Flash would not allow the user to highlight and copy responses to the clipboard either.

One day, I explored UltraHal Representative, developed by Zabaware, Inc. I liked the ability UltraHal provided to program the agent through a web interface.  It could include links to related information, could be customized with personalized graphics, and logged interactions.  Best of all, it had a straightforward approach to editing: no scripting, just type your question three different ways, then type your intended response. 

But I soon discovered that, without the ability to identify keyword priority, the results produced by whatever algorithm was built into the agent engine were too inaccurate for a virtual tutoring application.

I needed a product that could be programmed to be “omniscient”. 

“Effective ITS require virtual omniscience -- a relatively complete mastery of the subject area they are to tutor, including an understanding of likely student misconceptions.” (McArthur, Lewis, and Bishay, 1993)

I needed a virtual professor that could be “programmed” by real professors, the individuals who would have a mastery of the subject and an understanding of student misconceptions. But all of the chatterbots I had encountered so far (with the exception of UltraHal) required knowledge of scripting that most faculty members do not have the time to learn.  I would not have the time to provide one-on-one support to faculty developers, and paying a programmer to work with a faculty member would be too expensive.  (I noticed most developers of commercial agents actually relied on the scripting needs of their clients for their primary revenue stream.)  So, I decided to attempt a radically different approach to agent design.

I am an experienced Filemaker Pro solutions developer, and one day, while reviewing some of Filemaker’s text functions, I realized that the Position function could be used to detect keywords in a text string.  The beauty of the Position function is that the keyword can be identified anywhere within the target text; it is not dependent on a left-to-right orientation.  Filemaker is also not case sensitive.  In addition, the new version 7 allows most text-processing script calls to be used with its Instant Web Publishing interface, which I realized would greatly simplify web integration.
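For readers unfamiliar with Filemaker, the Position function returns the character position of a search string anywhere within a field, or zero if it is absent.  A rough Python analogue (my own sketch, not Filemaker code) shows why this sidesteps the word-order problem:

```python
def position(text, search, start=1, occurrence=1):
    """Rough analogue of Filemaker's Position() function: the 1-based
    position of the nth occurrence of `search` in `text`, matched
    case-insensitively, or 0 if it is not found."""
    t, s = text.lower(), search.lower()
    pos = start - 1
    for _ in range(occurrence):
        pos = t.find(s, pos)
        if pos == -1:
            return 0
        pos += 1  # 1-based result; also the start for the next occurrence
    return pos

print(position("How do I change my e-mail password?", "PASSWORD"))  # 27
print(position("How do I change my e-mail password?", "fax"))       # 0
```

Because the keyword is found wherever it occurs, a single test replaces the pairs of wildcard patterns required by left-to-right parsers.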

So, reviewing my experiences with the agent applications I had used, I developed a list of features that I wanted to incorporate:

Web functionality:
Multiple agents controlled by a single administration console
Web-based query interface
Web-based editing interface
Multiple graphic format support
Web accessible logging function for both agent editor and student user
Ability to display related resources

Query processing functionality:
Question context awareness (who, what, when, where, why, how, etc.)
Ability to weight keywords by importance without user scripting
Ability to return an alternate response if a question is asked more than once
Ability to use one response for different questions
Ability to process synonyms, international spelling differences, and misspellings
Independent of word order
Not case sensitive

Structural Design:
Modular design to enable importation of knowledge modules developed by others
Agent-specific attributes to customize the interface and responses, such as a personal greeting, the option to use the person’s homepage as a default URL, information about areas of expertise and research interests (for alternative agent-selection criteria), custom visual representations, etc.

I began by designing my response search criteria.  I programmed the agent search routine to categorize responses by the first word of the query – usually What, Where, Why, How, Who, Did, Can, etc. to establish the question context. Then I used position formulas to test for the presence of keywords.  I then developed an algorithm that weighted the primary keyword or synonym and totaled the number of keywords found in each record.

I designed the search function so that when the visitor presses the button to ask their question, the database first finds all responses for the question category (who, what, when, etc.) containing the primary keyword (or its synonym).  Responses are then sorted in descending order by the total sum of keywords present in each response.   The first record – the one with the most keyword matches – is displayed as the answer. 
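The whole routine fits in a short sketch.  The records, weights, and sample data below are hypothetical, but the flow is the one described above: filter by category and primary keyword (or synonym), score by keyword count, sort descending, and fire the top record.

```python
# Hypothetical response records: category word, primary keyword and
# synonyms, secondary keywords, and the stored answer.
responses = [
    {"category": "what", "primary": "cleopatra", "synonyms": ["queen of egypt"],
     "keywords": ["think", "love"],
     "answer": "Cleopatra...ah, what an intriguing woman."},
    {"category": "what", "primary": "gaul", "synonyms": [],
     "keywords": ["war", "conquer"],
     "answer": "All Gaul is divided into three parts."},
]

def best_response(question):
    q = question.lower()
    category = q.split()[0]          # the first word establishes the question context
    candidates = []
    for r in responses:
        if r["category"] != category:
            continue
        if r["primary"] not in q and not any(s in q for s in r["synonyms"]):
            continue
        # the primary keyword carries a heavy weight; each secondary keyword adds one
        score = 10 + sum(1 for k in r["keywords"] if k in q)
        candidates.append((score, r["answer"]))
    if not candidates:
        return None                  # fall through to the secondary finds
    candidates.sort(key=lambda c: c[0], reverse=True)
    return candidates[0][1]

print(best_response("What did you think of Cleopatra?"))
```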

If there are no category responses containing the primary keyword, then a second find will execute to look for all responses with the keyword regardless of category.  In working with other agent products, I have found that if you return a response with at least some information about the keyword, even if it is not an exact answer to the question, the student assumes the agent recognized their question and may learn auxiliary information that is still helpful to them.

For example, if a visitor asks my virtual Julius Caesar if he really loved Cleopatra, he will answer “Cleopatra…ah, what an intriguing woman.”  Not only is this more in character with Caesar (most of his female dalliances were for political reasons) but the answer could also be appropriate for a different question, “What did you think of Cleopatra?”  My search routine would find it in either case because of the weighting of the primary keyword, Cleopatra.

If there are no responses containing the primary keyword, a third find looks for any generic category responses.  For example, if a student asks who someone is and you have not programmed your agent with a specific answer for the keyword (the person they are asking about), the agent will reply with an appropriate “who” response such as “I’m afraid I’ve never made their acquaintance.” 

If a student’s question does not begin with any of the words set as category words, the last find returns a generic “what” response such as “I may be a fountain of knowledge, but I can’t be expected to know everything.”  Programming the agent with default generic responses ensures that the agent always has something to say, even if it knows nothing about the subject.  I developed a small database of generic responses for each question category that is imported into an agent database each time a new agent is created.  The faculty member can go through the responses and edit them if they wish.

Next, I turned my attention to the faculty’s content editing interface.  I wanted the faculty member to enter a proposed question, designate a primary keyword and a synonym, supply any other keywords they thought were important for identifying the desired response more precisely, and enter the desired response.  
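Put together, the finds form a four-tier fallback chain.  This sketch uses stand-in functions for the database finds; the two generic replies shown are the ones quoted above.

```python
import random

# Generic fallbacks per question category (imported whenever a new agent
# is created); only two of the categories are shown here.
GENERIC = {
    "who":  ["I'm afraid I've never made their acquaintance."],
    "what": ["I may be a fountain of knowledge, but I can't be expected to know everything."],
}
CATEGORY_WORDS = {"who", "what", "when", "where", "why", "how"}

def answer(question, find_in_category, find_anywhere):
    """Four-tier fallback: (1) a category find with the primary keyword,
    (2) a keyword find regardless of category, (3) a generic response for
    the category, (4) a generic 'what' response.  The two function
    arguments stand in for the database finds."""
    q = question.lower()
    first = q.split()[0]
    category = first if first in CATEGORY_WORDS else "what"   # tier-4 default
    return (find_in_category(q, category)
            or find_anywhere(q)
            or random.choice(GENERIC.get(category, GENERIC["what"])))

# With no stored responses at all, an unknown "who" question still gets
# an in-character reply from tier 3:
print(answer("Who was Vercingetorix?", lambda q, c: None, lambda q: None))
```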

I also provided a button that enables a faculty member to quickly generate a different answer for the same question or a different question for the same response.  

I created a field that is populated with a different random integer on each search.  Subsorting responses by this random integer enables the agent to offer a different response if the same question is asked more than once.  This supports the illusion that the agent is a “real” person because it will not necessarily return the identical response each time. 
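The mechanism is simple to demonstrate.  With two equally scored responses (hypothetical data below), a fresh random tiebreaker on each search lets either one surface:

```python
import random

# Two stored answers with equal keyword scores; the random field,
# repopulated on every search, breaks the tie differently each time.
tied = [
    {"answer": "Veni, vidi, vici.", "score": 11},
    {"answer": "I came, I saw, I conquered.", "score": 11},
]

def pick(responses):
    for r in responses:
        r["rand"] = random.randint(0, 1_000_000)   # repopulated on every search
    # primary sort: score descending; subsort: the random integer
    responses.sort(key=lambda r: (-r["score"], r["rand"]))
    return responses[0]["answer"]

print(pick(tied))  # either answer; varies between searches
```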

“Believable agents must be reactive and robust, and their behaviors must decay gracefully. They must also be variable, avoiding repetitive actions even when such actions would be appropriate for a rational agent. They must exhibit emotion and personality. Ultimately they must be able to interact with users over extended periods of time and in complex environments while rarely exhibiting implausible behaviors.” – Dr. Patrick Doyle, Believability through Context: Using “knowledge in the world” to create intelligent characters

With the “engine” of my agent developed, I turned my attention to the visual representation of the character.  In their paper, The Relationship Between Visual Abstraction and the Effectiveness of a Pedagogical Character-Agent, Hanadi Haddad and Jane Klobas of Curtin University of Technology, Perth, Western Australia, contrast the views of information systems designers outside the character-agent field with those of developers within it.

“Wilson (1997) suggests that more realistic character-agents may introduce distraction associated with the user’s curiosity about the personality of the character and overreading of unintended messages because of presentation complexity.”

“Unlike detailed realistic drawings, sketches help focus the mind on what is important, leaving out or vaguely hinting at other aspects. Sketches promote the participation of the viewer. People give more, and more relevant, comments when they are presented a sketch than when they are given a detailed drawing. A realistic drawing or rendering looks too finished and draws attention to details rather than the conceptual whole” (Stappers et al., 2000).

“On the other hand, research by psychologists suggests that people may put considerable cognitive effort into processing abstract representations of faces (Bruce et al. 1992; Hay & Young 1982). It is possible, therefore, that response to anthropomorphised character-agents, and especially their faces, may differ from responses to sketches. Gregory and his colleagues (1995) conducted studies on human response to faces at the physiological level. They demonstrated that humans are particularly receptive to faces. In terms of recognition, participants in their studies were more responsive to real faces than to abstracted line faces. They speculated, however, that people spend longer studying abstracted line faces and may find them more interesting (Gregory et al. 1995). If this is so, then contrary to theories of information design, an abstract face may introduce more distraction into the communication than a realistic face.”

Filemaker Pro 7 provides multimedia container fields that enable me to include still images, animations, or even video clips.  However, not only is creating a unique graphic for each response time-consuming, but motion video files can be quite large and slow the delivery of responses over the web.  Working with other agents, I had noticed that even the slight eye movement of a blink can be enough to reinforce the illusion of a sense of presence.  This approach straddles the two opposing theories described above: I would use a real face to capitalize on human receptivity to faces, but keep animation to a minimum to reduce distraction.  I also think the use of a real faculty member’s face serves to reinforce the bond between instructor and student.  A blink is also very easy to create from any faculty portrait.

I use an inexpensive animation tool called Animagic GIF Animator.  I begin with a portrait of the faculty member.  I open it in Photoshop (any image editor would suffice) and, after sampling the color of the skin above the eye, paint over the open eye.  Then I open an unedited copy of the portrait in Animagic, insert a frame, and select the edited version of the portrait.  I set the open-eye frame to repeat about 250 times and the closed-eye frame to repeat once, then loop the animation.
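For anyone without Animagic, the same two-frame blink can be produced programmatically.  This Pillow sketch substitutes generated placeholder images for the real portrait frames and uses per-frame durations instead of repeated frames; the effect is the same.

```python
from PIL import Image, ImageDraw

# Placeholder frames; in practice these are the original portrait and
# the copy with the open eye painted over.
open_eyes = Image.new("P", (120, 160), 200)
closed_eyes = open_eyes.copy()
ImageDraw.Draw(closed_eyes).line([(40, 60), (80, 60)], fill=0, width=3)

# One long open-eye frame plus one brief closed-eye frame, looped
# forever, reads as a periodic blink (durations are in milliseconds).
open_eyes.save(
    "blink.gif",
    save_all=True,
    append_images=[closed_eyes],
    duration=[5000, 120],
    loop=0,
)
```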

I created a related table that stores all unique information about each agent, including their default image, default greeting, login password, area of expertise, email address, and homepage URL. I also developed a collection of alternate avatars to use for agent images in case some faculty were camera-shy.  These were created with Poser using its ethnic character library.

Finally, I designed the login screen where the student selects the tutor with whom they wish to converse.  Upon selecting the tutor and pressing the “Begin Conversation” button, the student is presented with the query screen, including the individual greeting for the selected tutor.  

I also provided a button the faculty can use to log in and edit their agent.  It takes them to a layout that prompts them for a name and a password. 

Famed World War II cryptanalyst Alan Turing held that computers would, in time, be programmed to acquire abilities rivaling human intelligence.

Alan Turing at age 16.
“As part of his argument Turing put forward the idea of an 'imitation game', in which a human being and a computer would be interrogated under conditions where the interrogator would not know which was which, the communication being entirely by textual messages. Turing argued that if the interrogator could not distinguish them by questioning, then it would be unreasonable not to call the computer intelligent.” – The Alan Turing Internet Scrapbook 


My virtual professor may not be as sophisticated as the agents that have been developed to pass the Turing Test, but I hope I have provided a framework for the development of a rigorous inquiry-based learning system.

“Effective inquiry is more than just asking questions. A complex process is involved when individuals attempt to convert information and data into useful knowledge. Useful application of inquiry learning involves several factors: a context for questions, a framework for questions, a focus for questions, and different levels of questions. Well-designed inquiry learning produces knowledge formation that can be widely applied.” - Thirteen Ed Online.

References:

McArthur, David, Matthew Lewis, and Miriam Bishay. "The Roles of Artificial Intelligence in Education: Current Progress and Future Prospects". 1993. Rand. <http://www.rand.org/education/mcarthur/Papers/role.html#anatomy>.

Doyle, Patrick. "Believability through Context: Using 'Knowledge in the World' to Create Intelligent Characters." International Conference on Autonomous Agents. Bologna, Italy: ACM Press, 2002. 342-49.

Haddad, Hanadi, and Jane Klobas. "The Relationship between Visual Abstraction and the Effectiveness of a Pedagogical Character-Agent." The First International Joint Conference on Autonomous Agents & Multi-Agent Systems. Bologna, Italy, 2002.

Wilson, M. "Metaphor to Personality: The Role of Animation in Intelligent Interface Agents." Animated Interface Agents: Making them Intelligent  in conjunction with International Joint Conference on Autonomous Agents. Nagoya, Japan, 1997.

Stappers, P., Keller, I. & Hoeben, A. 2000, ‘Aesthetics, interaction, and usability in “sketchy” design tools’, Exchange Online Journal, issue 1, December, [Online], [2004, August 3].

Bruce, V., Cowey, A., Ellis, A. W. & Perrett, D. L. 1992, Processing the Facial Image. Oxford, UK, Clarendon Press.

Hay, D.C., Young, A.W. 1982, ‘The human face’, in Normality and Pathology in Cognitive Function, Ellis, A.W. ed., London, Academic Press, pp. 173-202.

Gregory, R., Harris, J., Heard, P. & Rose, D. (eds) 1995, The Artful Eye, Oxford University Press, Oxford.

"Thirteen Ed Online: Concept to Classroom".  2004.  Educational Broadcasting Corporation. 8/9/04 2004. <http://www.thirteen.org/edonline/concept2class/ >.

Hodges, Dr. Andrew. "The Alan Turing Internet Scrapbook". Oxford: University of Oxford, 2004. Accessed 9 Aug. 2004. <http://www.turing.org.uk/turing/scrapbook/test.html>.