Bartlett School of Architecture, UCL

The Role of Intuition in Designing Human-Robot Interfaces

“I can’t define a robot, but I know one when I see one.” [1]

More than a century ago, in 1898, Nikola Tesla first used radio signals to remotely control a miniature boat in a pond at Madison Square Garden, New York [2]. The crowd that gathered to witness the spectacle could not comprehend how the controller interface was connected to the floating battery-powered boat without wires. Radio signals were not yet part of the collective imaginary, and so people in the crowd claimed that the scientist was mind-controlling a trained monkey hidden inside the boat. The audience at Madison Square Garden turned to magic to explain something they could not otherwise fathom. Post-rationalising the episode, it is noticeable how intuition is largely based on both collective and individual sets of knowledge. New Yorkers had never seen anything like it before, so they could not associate radio waves with anything other than magic. As a result of this mistaken interpretation, nobody in the audience could have piloted the boat without being properly instructed. It is therefore vital that designers take a holistic approach towards designing human-robot interfaces.

With this dissertation I both claim and challenge the idea that intuition is the basis of effective and efficient human-technology interaction. In consumer electronics, the UX designer and the industrial designer are tasked with making the transition between users and technology as fluid as possible. This dissertation covers a very specific corner of the whole web of technology: robots.

1.1 Definition of Robot

The etymology of the word “robot” curiously originates outside the technical realm. It first appeared as a term in the humanities, in a 1920 play by Karel Čapek [3], where the robots were artificial human workers. From the Czech of the time, “robota”, the word robot indeed meant “slave labour”. The theatrical aspect is kept alive by Bruce Sterling’s definition [4] of robots as tools for “dramatic effect”, not mere pieces of technology. He placed the focus on the rituals and interactions produced when robots engage with the user, downplaying the importance of the mechanics and electronics around them. With a more technical tone, the Robot Institute of America defined the robot in 1979 as “A reprogrammable, multifunctional manipulator designed to move material, parts, tools, or specialized devices through various programmed motions for the performance of a variety of tasks” [5]. This definition makes clear the dualism between the abstract “program” of the robot and its physical “performance”. A third way of looking at robotics is introduced by Carlo Ratti. In the catalogue of the Hello Robot exhibition [4] he makes things simpler by stating that a robot is anything that possesses some form of intelligence, senses its surroundings and actuates something else. Broad as it is, this definition interestingly puts a cellular phone and a smart kitchen in the same category. In all of these definitions, however, important aspects of robotics such as intelligence and self-programming are missing.

1.2 Overview

The research reported here strives to understand the role of intuition in the design of novel human-robot interfaces, where an interface is the gate between a piece of technology and its designated user. That can be summarised in two questions. What does the word intuition mean, and what does it imply between engineers, users and technology? Can intuition be “engineered”?


2 Defining Intuitive Interaction

“You please me much more by being moved by sculpture than by painting – all the more so since Theo assures me that you also have a good eye for paintings. Naturally, that couldn’t yet be a settled taste, which will never waver, but intuition, instinct, is already a great deal, and precisely what everyone doesn’t always have.” [7]

Vincent van Gogh, writing to his sister Willemien, gives us the poetic connotation of intuition that has widely been associated with artistic “gut”. However, there is an element of pre-destination in his words that is misleading. Intuition, not to be confused with instinct, constitutes the way we try to make sense of the world around us. Looking at the classical roots of the concept, intuition corresponds to the Greek νοησις [nóêsis], the operation of νους [nous] without the benefit of the discursive reasoning that characterises διανοια [diánoia] [8].

In Plato’s doctrine, the act of intuition means becoming conscious of pre-existing knowledge, and it is referred to as anamnesis [9]. In modern psychology, Carl Jung defines intuition as an “irrational function” and stresses the complete unintentionality of the process [10].

Shifting from the humanities to HCI (human-computer interaction), many principles remain relevant to graphical, tangible and gestural controllers. In the technical realm, intuition is associated with ease of use and involves knowledge spanning from human factors to ergonomics. Blackler et al. [11] state that an intuitive interface design uses knowledge that users have gained through experience with other products. A commercial example is the launch and success of the Apple iPad in 2010. Tablets were not a novelty in the world of consumer electronics: Microsoft had launched its own tablet PC almost ten years earlier, meeting with a remarkable fiasco. Steve Jobs made the iPad a success partly through a marvellous marketing campaign. From a usability point of view, however, the fact that the iPhone had been released a few years earlier meant that users were already accustomed to the touch interface. Indeed, people made sense of the new tablet by stating that it functioned like “a larger iPhone” [12]. Blackler et al. also outline a set of guidelines for HCI practitioners who aim to create intuitive interfaces [11]. Paraphrasing their words, here are the three principles for designers to follow:

.1 If a function is well known, use pre-existing symbols, shapes and icons;
.2 If a function is completely new, then use metaphors and relate it to something that already exists;
.3 The appearance and location of the same features should be consistent throughout each part of the design;

Researchers at Eindhoven University of Technology [13] introduced a dualistic view of interaction: semantic-based or direct-based. In their own words, “In the semantic approach, the appearance of the product and its controls become signs, communicating their meaning through reference. Products resulting from this approach–be it hardware or software–often use control panels labelled with icons or may even be icons in themselves.” [13] Alongside that, the direct approach “takes behaviour and action as its starting point. Here, the basic idea is that meaning is created in the interaction. Affordances only have relevance in relation to what we can perceive and what we can do with our body: our effectivities.” [13] It becomes relevant at this point to describe the term “affordance”. In Norman’s [14] words, an affordance is the relationship between the appearance of an object or controller and the action of its potential user. For instance, a chair affords sitting, but when it is lightweight it also affords lifting and perhaps throwing.

As for how much of a role intuition plays in the approach to new technologies, arguably intuition should not become a sine qua non condition for novel interface design. With this dissertation I am not advocating that intuitive and usable interfaces are the only way to pursue interface design. When radically new technologies emerge together with new rituals, the HCI community has the necessity, and the freedom, to design new paradigms. These should then be supported by guidelines and a structured learning process. An example can be traced back to the release of the first commercial computer mice: however hilarious and anachronistic it might sound today, Apple released a set of tutorials on floppy disks on how to operate its computer mouse back in 1984 [15]. On a more cautionary note, researchers have also pointed out that reusing the same features can lead to a lack of innovation. In particular, repetition can result in a huge production of slightly diversified sameness [16].

To conclude, it can be argued that ease of use has long been a key concern of the HCI scientific community. However, doubts remain. If intuition is largely based on pre-existing knowledge, should designers confine themselves to the existing? The next chapter will showcase four prototypes that result from an intuition-based design approach.


 

3 Four Prototypes of Playful and Naive Explorations

“To put the matter differently, “play” (and its associated behavioural variability) is not purely entertainment or a luxury to be given up when things get serious. It is itself a highly adaptive mechanism for dealing with the reality that the context for behaviour is always largely unknown” [17]

Intuition does not solely play a role in the way a user approaches a designed object. To a large extent, intuition is also the natural way in which a designer jumps into a project brief. In this chapter, four pieces of personal work are described, within the theme of human-robot companionship. As an initial study, four functions were mapped to four behaviours, in order to develop a set of working prototypes and acquire some general insight. The four pairs are:

.1 Hitting: a study on pain and shared empathy;
.2 Stroking: a study on intimacy;
.3 Following: a study on obedience and servile attitude;
.4 Blowing: a study on unpredictable and irreverent behaviour;

It can be argued that this approach is loosely structured and largely reliant on play as a vector for serendipitous discoveries. In fact, the prototypes were part of a rather free and unstructured design process. As a designer, I used my own intuition to explore interaction paradigms through making. At the end of this chapter, an informal evaluation of each prototype is reported.

3.1 First Interaction study

The first robot investigates the territory of pain and shared empathy. The underlying assumption is that if a robot can simulate pain, it effectively allows for mistreatment. Three agents interact in a cycle: R1, the actuator stick, acts on H, the human agent, who intervenes on R2, the sensor stick. The interaction unfolds as shown in the following images.

Figure 1. Overview

Figure 2. Both sticks are initially at rest, until a third agent, the human hand, interacts with the sensor. The stick affords twisting; if twisted gently, no reaction is triggered

Figure 3. If the stick is agitated violently, the second agent breaks into the scene with a defensive attitude towards its peer
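The reaction logic amounts to a threshold on the sensed twist. Below is a minimal sketch in Python of that logic, assuming a normalised twist reading in [0, 1]; the function names, threshold and polling rate are illustrative assumptions, not the original embedded implementation.

```python
import random
import time

TWIST_THRESHOLD = 0.6   # assumed boundary between gentle and violent twisting

def read_twist_intensity() -> float:
    """Stand-in for the R2 twist sensor (here: random noise)."""
    return random.random()

def defensive_gesture() -> None:
    """Stand-in for driving R1 into its defensive posture."""
    print("R1 breaks into the scene with a defensive attitude")

for _ in range(200):                                  # bounded sensing loop
    if read_twist_intensity() > TWIST_THRESHOLD:      # violent agitation
        defensive_gesture()
    # gentle twisting (below the threshold) triggers no reaction
    time.sleep(0.05)
```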

 

3.2 Second Interaction study

The second prototype is a study on intimacy through touch. The scenario comprises a robot, where R1 is the actuated stick and R2 is the pressure-pad sensor, and a human agent H.

Figure 5. Overview

Figure 6. The white pad affords touching and pressing, given its soft material. When sufficient pressure is applied to the pad, R1 moves towards the hand

Figure 7. According to the velocity and intensity applied, R1 responds with a mirrored behaviour
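The mirrored behaviour can be read as a proportional mapping from the sensed touch to R1’s motion. A minimal sketch follows, assuming normalised pressure and velocity readings; the gains and threshold are illustrative assumptions rather than the values used in the prototype.

```python
PRESSURE_THRESHOLD = 0.3   # assumed minimum pressure before R1 reacts

def r1_motion_amplitude(pressure: float, velocity: float) -> float:
    """Map touch pressure and velocity on R2 (both in [0, 1]) to a
    motion amplitude for R1, mirroring the intensity of the touch."""
    if pressure < PRESSURE_THRESHOLD:
        return 0.0                              # too gentle: R1 stays at rest
    return 0.7 * pressure + 0.3 * velocity      # proportional, mirrored reply
```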

 

3.3 Third Interaction study

The third prototype focuses on wild behaviour and obedience. In this case, touch is only partially suggested, but it never actually occurs. Without human intervention, the robot creates rapid and unpredictable movements. When the human is recognised by the sensor, the stick follows the hand quietly. R1 is a stepper motor that moves the stick on a rail. R2 is a proximity sensor and finally H is the human agent.

Figure 8. Overview

Figure 9. R1 moves around in space to random positions with rapid movements

Figure 10. When H is in front of the sensor, R1 stops. The stick follows the hand while it is in range of R2, keeping the distance between R2 and H constant
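The behaviour alternates between random wandering and a simple follow controller. Below is a minimal sketch, assuming distances in centimetres and a proximity reading that saturates outside the sensor range; all constants are illustrative assumptions.

```python
import random

TARGET_DISTANCE = 10.0   # cm the stick tries to keep between R2 and H
SENSOR_RANGE = 30.0      # cm beyond which the hand counts as absent
GAIN = 0.4               # proportional gain of the follow behaviour

def next_rail_position(position: float, hand_distance: float) -> float:
    """One control step for the stepper-driven stick on its rail."""
    if hand_distance > SENSOR_RANGE:
        # no hand detected: jump to a random spot (wild behaviour)
        return random.uniform(0.0, 100.0)
    # hand detected: correct the position so the distance stays constant
    return position + GAIN * (hand_distance - TARGET_DISTANCE)
```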

 

3.4 Fourth Interaction study

The last prototype focuses on joyful and silly behaviours. While the response of the robot is initially coherent, it later becomes unpredictable in a playful way. R2 is the sensor that detects how hard a person blows air; R1 is the actuator, a computer fan; and H is the human agent. The interaction modality is totally intangible, yet the intensity of the reaction is directly proportional to the pressure applied on the sensor. However, the scheme is broken by random behaviours of R1.

Figure 12. When H blows at R2, R1 responds with an equal reaction in the opposite direction

Figure 13. At some random moment, R1 will blow toward H in a very unpredictable and irreverent way
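The fan’s logic is a proportional response occasionally overridden by a random gust. A minimal sketch, with the probability and gust range as illustrative assumptions:

```python
import random

IRREVERENCE_PROBABILITY = 0.05   # assumed chance per reading of a random gust

def fan_speed(blow_pressure: float) -> float:
    """Map sensed blow pressure in [0, 1] to fan speed in [0, 1]."""
    if random.random() < IRREVERENCE_PROBABILITY:
        return random.uniform(0.5, 1.0)   # scheme-breaking blow towards H
    return blow_pressure                  # proportional, mirrored response
```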

 

Looking back at the whole set of prototypes, playfulness and naivete emerge as consistent features. From an interaction point of view, studies #1 and #2 are based on touch and mimic human rituals, while studies #3 and #4 relate to gestures and involve an intangible interaction. Purposefully, only tangible interfaces will be considered from now on, leaving gestural interfaces out of this investigation. Beyond the need to narrow down the scope of this research, arguments can be made against the widespread adoption of this fairly novel modality in recent years. Norman and Nielsen [18] define the introduction of gestural controllers as a step back in usability, referring mainly to their lack of consistency, the difficulty of understanding new operations and the ease of triggering unwanted reactions. The exercise of building behaviour-based prototypes manifests my prior knowledge of interactive devices, in the several ways it reached me. Even though it is difficult at this point to map these prototypes to specific real-world examples or fictional characters and controllers, a few elements definitely resonate and appear as embodied knowledge. The next chapter deals with science fiction, and hopefully a few lights will start blinking.


 

4 The Cultural Heritage of Science Fiction in Designing Human-Robot Interaction

“If you don’t have a sense of wonder it’s like you’re dead inside,” he says. “But your sense of wonder can be used to trick you. You can have a sense of wonder over a thing that’s basically a conjurer’s trick, or a con job, or a rip-off.” [19]

If intuition is largely based on pre-existing knowledge, then science fiction is arguably a pool of icons to which we are all at least partially exposed. This chapter explores how science fiction, here considered a vehicle of pop culture, shapes and influences our relationship with and opinion of robots. Before proceeding with the discussion, it is necessary to define which section of the whole science fiction genre will be taken into account. For the purpose of this discussion, only three media will be considered: motion pictures, animated movies and novels. As far as topics are concerned, the following analysis is limited to a selection of works relating to non-humanoid companion robots and touch-based controllers. Physicality is the fil rouge running through the examples presented hereafter. The choice of looking at mechanical controllers as well as human-robot interaction intends to bridge the two inspirational archetypes. On one hand, looking at the way robots interact with humans in science fiction can inspire the design of behaviour-based interactions, where reciprocity of interaction is a must. On the other hand, looking at physical controllers can provide insights into more direct interaction paradigms.

4.1 Science Fiction Meets Interaction Design

Science fiction is hardly ever solely about technology. Rather, it is a hybrid terrain where the humanities and social studies can meet engineering. N. Shedroff and C. Noessel published a compendium [20] that analyses dozens of science fiction movie interfaces and gadgets to draw correlations with existing technologies. The authors believe that the reason science fiction can be deeply inspirational for engineers lies in the fact that it possesses only narrative constraints [20]. However, mixing technological development with narrative can lead to challenging scenarios. When researchers pick up ideas from science fiction, they might set the bar too high for the technical challenge. Dourish and Bell [21] argue that the audience might end up frustrated when a commercial product does not meet the expectations evoked by popular fictions. Perhaps a more shared vision can resolve such deadlocks.

4.2 Examples in Literature

Literature was surely the birthplace of modern science fiction, which boomed after the Second World War, leaving behind a bitter-sweet mix of excitement and fear. Amongst many, two authors are definitely worth mentioning: the British Arthur C. Clarke and the American Isaac Asimov. The vast production of the two authors extended beyond the page and contaminated cinema as well.

2001: A SPACE ODYSSEY

The most spectacular and renowned work of Sir Arthur C. Clarke is definitely 2001: A Space Odyssey, written back in 1968 alongside the screenplay he developed jointly with Stanley Kubrick. The main antagonist of the plot is HAL 9000, an artificial-intelligence computer that monitors the commands of the spaceship on the Discovery One mission. Hardware-wise, HAL resembles a contemporary computer with a sophisticated voice interface. Despite being designed to assist and serve its human counterparts, HAL malfunctions. Eventually, the astronaut Bowman finds it appropriate to disconnect the machine. The physical interaction between the two at the moment of HAL’s death is described by the author as follows: “He (Bowman) released the locking bar on the section labelled “cognitive feedback” and pulled out the first memory block. The marvellously complex three-dimensional network, which could lie comfortably in a man’s hand yet contained millions of elements, floated away across the vault. “Hey Dave” said Hal. “What are you doing?” I wonder if he can feel pain? Bowman thought briefly. Probably not, he told himself.” [22] The conversation continues with HAL singing a song to Bowman while realising that he is vanishing forever. ““Daisy, Daisy, give me your answer, do. I am half crazy, all for the love of you.” The voice stopped so suddenly that Bowman froze for a moment, his hand still grasping one of the memory blocks still in circuit… Bowman could bear no more. He jerked out the last unit, and HAL was silent forever.” [22] This novel covers a few topics of interest to this research. In the first place, Arthur C. Clarke touches upon the theme of disobedience: HAL suddenly puts its own survival before the orders of its master. That brings us to the second interesting aspect: the AI system possesses a real body and can experience a form of physical pain, as can be understood from its last words.

I, ROBOT

Isaac Asimov is known for a vast body of writing, ranging from science fiction to actual science. Amongst thousands of pages of work, he is probably best cited for the story “Runaround”, part of the book “I, Robot”. There he first outlines the so-called “three laws of robotics”, which define robotic companions as an appendix to human activities and safety. The laws can be summarised as:

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm;

2. A robot must obey orders given by human beings except where such orders would conflict with the First Law;

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law;

This very human-centred vision of early robotics culminates in the passage collected below, where the protagonists of the story need to “ride” a robot that would not otherwise depart. “Powell stared and then pinched his moustache. ‘Uh… Oh!’ Donovan’s eyes bulged. ‘We’ve got to ride him? Like a horse?’ ‘I guess that’s the idea. I don’t know why, though.’ […] He placed a foot in the improvised stirrup and swung upward. He found the seat comfortable; there was the humped back of the robot, evidently shaped for the purpose, a shallow groove along each shoulder for the thighs and two elongated ears whose purpose now seemed obvious. Powell seized the ears and twisted the head. His mount did not feel at all light-hearted.” [23] These two passages manifest human supremacy over the “race” of robots. What is peculiar about it is, once again, the physical nature of the interaction. The dominance is not only manifested through words; it possesses a bodily dimension. After all, most robotic companions are controlled through a series of verbal commands or a button interface.

4.3 Examples in Cinema

Figure 14. Filmmakers and technologists collaborating towards the creation of compelling new visions


As early as the 1910s, science-fiction screenplays began to attract the attention of the emerging cinema industry. Amongst many examples, one movie in particular showed a significant collaboration between a film director and the HCI community. Minority Report [24] reflects the tight collaboration between its production designers and the tech community of the MIT Media Lab. The directing team saw clear benefits in including technologies that would be commercially implemented in the near future [25]. Schmitz et al. [25] present a schematic that explains, in the eyes of the authors, the ties between the worlds of sci-fi writers and technologists (Figure 14). Their vision shows how reciprocal the contamination between technology and cinema should be. Interestingly, the final desirable outcome is a shared vision that bridges the two disciplines.

Figure 15. DUM droids mostly work as playful teams


DUM-SERIES (STAR WARS SAGA) A vast body of inspiration was found in the Star Wars saga. Speaking in general terms, it can be said that most of the robots presented in Star Wars follow the very same human-centred approach introduced by Asimov’s three laws of robotics. The first droid of relevance is called DUM [26]. It serves as a mechanic fixer in several movies and contexts (Figure 15). Most of the time, DUM behaves in a very silly manner, pranking its colleagues, with whom physical contact is common. When it comes to interacting with humans, DUM droids retract to a compact size once they are hit on their heads. All of this happens in a very hilarious way, mainly because their joints are particularly springy.

Figure 16. Chopper, the fun companion robot in Star Wars Rebels. Close-up on his interface


C1-10P (STAR WARS REBELS) Part of the animated series Star Wars Rebels [27], Chopper is a three-legged companion droid (Figure 16). However, Chopper has a very sharp character which makes him self-centred and rude at times. Interaction with him is always unpredictable, especially because Chopper, as fans call him, tends to run away from physical contact.

Figure 17. The phone interface in Aliens


PHONE MACHINE (ALIENS, 1986) In the movie Aliens [28], Ripley accesses a very curious phone interface (Figure 17), definitely revolutionary for its subversion of paradigms. Shifting away from the idea that making a call requires remembering a string of digits, the director shows how a business card can become a unique identifier. Once the card is slid into the allocated slot, the call is placed, without any need to press buttons. Similarly, the call ends when the card is removed.

Figure 18. Overview of the command cockpit of the Enterprise, where the manual steering column is located


ENTERPRISE (STAR TREK: INSURRECTION) Star Trek has also shown off a vast collection of human-computer interaction prototypes. Most remarkably, the spaceship Enterprise gave stage designers an opportunity to exercise their creativity, combining screen-based with mechanical controllers. In particular, they show how much easier it is to regulate speed with a lever than with a touch-screen interface: holding a position on a trackpad would be cumbersome. In Star Trek: Insurrection [29], the use of the “manual steering column”, which is very close to a joystick, is memorable, since previous instalments used huge screens and button interfaces (Figure 18).

Even though the selection contains a relatively small pool of examples, a comparison between the novels and the movies can be made. On the one hand, text-based science fiction can trigger a great variety of novel content, given that pure narrative enjoys a lot of freedom. As an inspiration source, novels also leave enough freedom to imagine how things actually look, which can be favourable to some extent. Movies, on the other hand, leave more space for theatrical aspects, such as movements and special effects. The next chapter will show the effort of synthesising a few of the elements described here into prototype controllers.


5 Design Case: a Set of Four Robotic Controllers for a UR10 Robotic Arm

“To make a violin, take the wood and carve away all that is not the violin.” [30]

In order to test both the empirical knowledge gained through the making of the initial prototypes and the review of science fiction examples, I outlined a design case that could give a tangible context for the subsequent interaction design studies. The use of a robotic arm can combine the qualities of robotics, expressed through movement, with the controllers. Defining what is a work of art goes beyond the scope of this dissertation; talking about design agency, however, is precisely the point. Industrial CNC machines such as 3D printers, laser cutters and robotic arms have been infiltrating design and architecture studios for quite some years now. The more CAM (Computer-Aided Manufacturing) tools populate our desks, the more professionals get confined to a making process that consists solely of preparing computer files. The designer is totally detached from the actual making process, whereas in the old days handling the tools gave the designer more control throughout the process. As Maeda [31] points out in the introduction to his book “Design by Numbers”, “current creative programs exploit a very narrow range of motor skills.” Is it possible to bring back intuitive and expressive qualities to the controllers of modern times? Potential users of this application would be artists, designers and architects who are open to an exploratory and performative making process. Such users should also consider unpredictability and lack of control as meaningful aspects rather than unpleasant and unforeseen ones. In order for the robotic arm to paint on a canvas, a list of four functions was outlined, to be performed through four different tangible actions. Separating rather than integrating seemed the best option, as it allowed each function-action coupling to be studied specifically:

1. The function of switching on was mapped to the action of grabbing;

2. The function of moving in cartesian space was matched with the action of manoeuvring;

3. The function of regulating the speed was mapped to adding and removing blocks;

4. Finally, the functions of recording and playing back, conveniently squeezed into the same controller, were mapped to inserting and removing cassettes.

The prototypes are reported below, in combination with the computer simulation that was run to study the effectively perceived behaviour.

5.1 First Robot Controller

Directly inspired by Isaac Asimov’s story “Runaround”, reported in the previous chapter, the first controller is a gestural switch (Figure 19). In order to activate the robot, the user is asked to perform a move of physical dominance and abandonment. This controller carries forward the 70-year-old message that machines should be under the mastery of humans at all times, at all costs.

Figure 19. The user places his hands in the assigned slots and firmly grabs the metal sticks, which function as capacitive sensors. When the hands are released, the robotic arm switches off. As a result, the robot can never be left unattended
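Functionally this is a dead-man’s switch: the arm is enabled only while both sticks sense a grasp. A minimal sketch follows, assuming normalised capacitive readings; the threshold is an illustrative assumption.

```python
GRASP_THRESHOLD = 0.7   # assumed normalised capacitance of a firm grip

def robot_enabled(left_stick: float, right_stick: float) -> bool:
    """The arm runs only while BOTH capacitive sticks sense a grasp."""
    return left_stick > GRASP_THRESHOLD and right_stick > GRASP_THRESHOLD
```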

 

5.2 Second Robot Controller

The robotic arm moves in a very expressive way, especially considering that it possesses six axes of rotation. Inspiration for the design of this controller therefore came from a suggestive character: the DUM-series pit droid. Despite the droid lacking a proper interface, it can be argued that its springy head actually works as one. Translating that into a joystick-like controller, two interaction modalities were implemented (Figure 20). On one hand, moving the “head” around moves the end effector in space. On the other hand, the springs generate unexpected reactions and, much like its inspirational character, the controller might seem silly.

Figure 20. The controller works pretty much like a joystick, allowing two-axis control plus third-axis control through pressing. If the hand is released at any moment, the springs will cause the controller to move in a silly way
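The mapping can be sketched as two tilt axes driving the end effector in the plane and the press depth driving the third axis. The gains and sign convention below are illustrative assumptions, not the values of the built prototype.

```python
XY_GAIN = 5.0   # assumed mm of end-effector travel per unit of tilt
Z_GAIN = 3.0    # assumed mm of travel per unit of press depth

def end_effector_step(tilt_x: float, tilt_y: float, press: float):
    """Map tilt in [-1, 1] and press depth in [0, 1] to an (x, y, z) step."""
    return (XY_GAIN * tilt_x, XY_GAIN * tilt_y, -Z_GAIN * press)
```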

 

5.3 Third Robot Controller

The speed controller picks up on the dramatic and intense relationship of Bowman and HAL 9000 at the end of 2001: A Space Odyssey. Metaphorically, a grid of blocks represents the vigour of the robotic arm, which can be faded through the gesture of removing parts of it (Figure 21). The robot does not simply become slower or faster in its performing speed; it also manifests a hectic as opposed to a lazy behaviour.

Figure 21. The grid of 3×3 blocks gives constant visual feedback on what percentage of the total speed is being run. Removing one block corresponds to lowering the velocity by approximately 10% of the total
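The mapping from blocks to speed is direct: each of the nine blocks contributes one ninth of full speed, roughly the 10% per block that the caption approximates. A minimal sketch:

```python
TOTAL_BLOCKS = 9   # the 3×3 grid

def speed_fraction(blocks_present: int) -> float:
    """Map the number of blocks on the grid to a fraction of full speed."""
    blocks_present = max(0, min(blocks_present, TOTAL_BLOCKS))
    return blocks_present / TOTAL_BLOCKS   # one block ≈ 10% of the total
```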

 

5.4 Fourth Robot Controller

The fourth controller takes inspiration from CD and cassette players in general and the phone interface in Aliens in particular. From there, the idea that no buttons or semantic graphics are required on top of the interaction seemed worth pursuing. The user is invited to use the tokens as memory cards: based on where they are located, the system is either recording a session or playing it back. It is proudly button-less.

Figure 22. The interface is composed of two slots and multiple cassettes that can be inserted and removed. While a cassette is located in the upper slot, the device records the performed movements; when the cassette is placed in the lower slot, the interface replays the movement. Combining the two actions would ideally not be allowed
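The slot logic can be read as token-keyed record/replay logic. A minimal sketch, assuming the system can identify which token sits in which slot; class and method names are illustrative assumptions.

```python
class CassetteDeck:
    """Two-slot token logic: the upper slot records, the lower replays."""

    def __init__(self):
        self.tracks = {}                      # token id -> recorded poses

    def update(self, token_id: str, slot: str, current_pose):
        if slot == "upper":                   # record the performed movement
            self.tracks.setdefault(token_id, []).append(current_pose)
            return None
        if slot == "lower":                   # replay the recorded movement
            return self.tracks.get(token_id, [])
        return None                           # no token inserted
```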

 

An example of the simulation that was run is shown below (Figure 23).

Figure 23. The simulation set-up includes the controller and a custom made laptop interface programmed in Processing, where the robotic arm simulation is run


Figure 24. Overview of the room where the usability test was run


 


 

6 Preliminary User Test

The purpose of this study was to gather preliminary knowledge of the controllers and identify usability issues so that they could be addressed as part of an iterative design process. The study aimed to gather both qualitative and quantitative data. Amongst a wide choice of possible testing methods, a heuristic evaluation seemed the most useful. In usability engineering, such an evaluation is preferred over user testing when the assessed items are at an early prototyping stage. Heuristics are general principles of interaction design on which the human-computer interaction community agrees. Nielsen [32] outlines ten principles, which are arguably the most used. A set of four statements was derived from the principles. For each item, the interviewee was invited to give a grade on a scale from 1 to 7 (where 1 = completely disagree and 7 = completely agree). In full, the questionnaire contained the following statements:

1. Visibility of system status - The interface keeps me informed about what is going on, through appropriate feedback within reasonable time.

2. Error prevention and recovery - The interface design prevents me from making mistakes.

3. Recognition rather than recall - The sequence of actions is clear and understandable, and enough information is provided when necessary.

4. Aesthetic and minimalist design - The interface does not provide irrelevant information, which might diminish the visibility of relevant parts of the dialogue.

Next to that, users were also encouraged to give a qualitative comment on each one of the interfaces.
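Aggregating the responses is then straightforward: a mean per heuristic across the six evaluators. The sketch below uses placeholder scores, not the study’s actual data.

```python
from statistics import mean

# Placeholder 1-7 ratings from six evaluators (illustrative values only)
ratings = {
    "Visibility of system status":     [4, 3, 5, 4, 2, 3],
    "Error prevention and recovery":   [3, 2, 4, 3, 3, 2],
    "Recognition rather than recall":  [5, 4, 5, 6, 4, 5],
    "Aesthetic and minimalist design": [6, 5, 6, 5, 6, 6],
}

for heuristic, scores in ratings.items():
    print(f"{heuristic}: mean {mean(scores):.1f} / 7")
```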

6.1 Methodology

For the purpose of the study, six participants aged 20 to 30 were selected. The only restriction was that participants had to be right-handed. The prototypes were presented together on the table (Figure 24). For each of them, the following protocol was run:

1. Researcher presents the prototype function;

2. Evaluator is free to explore the interface for about one minute;

3. Evaluator fills in the heuristic template;

4. Evaluator writes down further comments that might come up;

6.2 Results and Re-design

The first controller raised several concerns about the two pipes sticking out in the middle. Evaluators thought they were misleading, in the sense that people might want to use them as joysticks. Furthermore, the function of switching on is mostly associated with the action of pressing a button or moving a lever; according to the evaluators, this scenario lacks mechanical feedback to the user. The quantitative data are shown below (Figure 25) and outline a rather negatively perceived interaction modality.

Figure 25. Results obtained from robot controller #1


 

Clearly, this interface required a redesign. The second iteration uses one hand only and removes the confusing stick (Figure 26).

Figure 26. Redesign of robot controller #1


 

The second controller was perceived as intuitive, even though several remarks were made. In the first place, evaluators expected to be able to rotate the pad. While movement along the horizontal axes is clear, the vertical axis is not well implemented, and perhaps rotation could control it. Lastly, a remark was made about the noise produced by the springs, which is apparently distracting. Looking at the quantitative data shown in the chart below (Figure 27), it was agreed that the performance was sufficient and no redesign was necessary.

Figure 27. Results obtained from robot controller #2


 

The third controller was clearly the one evaluators described as most counter-intuitive. To begin with, the grid layout was not mentally associated with speed, as the choice of using nine blocks seemed very arbitrary. Besides, we are mostly used to thinking of velocity as continuous, as opposed to discrete. The graph below shows how all these factors resulted in a mediocre quantitative performance (Figure 28).

Figure 28. Results obtained from robot controller #3


 

The redesign aimed at making the physical visualisation more linear: four blocks of increasing height were coupled with four speed stages (Figure 29).

Figure 29. Redesign of robot controller #3


 

The fourth controller scored well overall (Figure 30). The cassette metaphor was clear and had a “vintage” taste to it that the evaluators appreciated. The fact that the interface was built on two levels made it clear that the two slots had different meanings, even though the assignment of recording versus playing back remained arbitrary. After the data were analysed, a redesign did not seem vitally important.

Figure 30. Results obtained from robot controller #4


 

Overall, two controllers were redesigned. Looking back at the evolution of the two, it is clear how distant the final prototypes are from the original sci-fi references. This leads to questioning how many steps are necessary before an interface can no longer be reconnected to the original work. How much of the original science fiction is necessary for the creation of an intuitive association? And is it even necessary at this point to have these features so explicit?

6.3 Setup Proposal

To make the observations more tangible, the final set-up of the controllers and robotic arm is introduced. The controllers, located in front of the robot, were positioned, from left to right, in the following order: 3, 4, 2, 1 (Figure 31).

Figure 31. The set-up of the room, where the robotic arm sits right in front of the controllers


 

The results are shown in the pictures below (Figures 32, 33) and involve a large deployment of Wizard of Oz solutions.

Figure 32. The UR10 robotic arm painting an abstract drawing


Figure 33. Close-up on the custom-made end effector with a brush


 

However, a few considerations can be made about the set of controllers as they were used by an actor. The strategy of studying each function with a separate controller showed its limitations when the interfaces were put together: the sequence of rituals and actions turned out to be poor, and the user can dedicate attention to only one controller at a time. Further design work could combine controllers while preserving the same set of functionalities. For example, the speed interface could become obsolete if the joystick could detect the velocity at which it is manoeuvred; a minimal sketch of this idea follows.
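The sketch below assumes normalised joystick samples arriving at a fixed interval; the function name and the clamping range are illustrative assumptions rather than part of the built prototype.

```python
def speed_from_joystick(prev_xy, curr_xy, dt: float) -> float:
    """Estimate manoeuvre speed from two successive joystick samples,
    clamped into the arm's normalised speed range [0, 1]."""
    vx = (curr_xy[0] - prev_xy[0]) / dt
    vy = (curr_xy[1] - prev_xy[1]) / dt
    return min(1.0, (vx * vx + vy * vy) ** 0.5)
```

With the next and final chapter, a full reflection on intuitive interaction design will be outlined.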


 

7 Discussion and Conclusion

“Mental phenomena have real, as well as intentional, existence. Empirically we might be satisfied with a structure as a purely physical-spatial entity but, intellectually and spiritually, we need to understand the motivation behind it. This duality of intention and phenomena is like the interplay between objective and subjective or, more simply, thought and feeling.” [33]

This dissertation investigated the role of intuition in human-robot interaction and questioned the duty of interaction designers. During the process, intuition recurred at two separate stages. In the first place, intuition served as a design-thinking method to approach a new brief and led to four prototypes. Later, it was framed through a small yet significant set of pop-culture examples. Starting from those, a set of controllers was developed to fit an arbitrarily chosen context. The scenario provided multiple insights into the initial questions. The fact that each function-action pair was embodied in a separate controller allowed for individual discussion. On the negative side, though, it emerged from usage that the sequence of actions is a relevant aspect of creating a compelling and satisfying interaction.

The project combined two strategies: divergent explorations at the beginning and convergence on a design case at the end. Such a blend proved successful, since it allowed for the production of several insightful prototypes. In support of the thirst for intuitive interfaces, this dissertation shows how similar products or interfaces sometimes resonate in our memory and leave us with an easy way into a technology. Reference features might be very explicit, as shown in the design of robot controllers #3 and #4, or might pass through a thicker layer of abstraction, as the rest of the controllers show. Following this path, it might be interesting to consider whether it would be possible to “engineer” intuition. That could imply, but not be limited to, preparing the audience for a radically new technology when futuristic media introduce it. Design fiction and speculative art might be more suitable environments for such explorations.

Even though plenty of meaningful insights emerged while reviewing science fiction examples, design should not limit itself to re-interpreting and re-adapting existing paradigms. If interface design comes only from the technical world of engineering and human factors, there is a risk of getting stuck in existing methodologies and beliefs. Intuitive interfaces, designed and evaluated through usability principles alone, leave out a whole world of experimentation and naivete that is equally valuable. All in all, intuition is perhaps not to be treated as gold. Letting go of ergonomics and cognitive factors in products can free designers’ creativity. Intuition should maybe leave some space to spontaneity.


 

Bibliography

[1] Credited to Joseph Engelberger, pioneer in industrial robotics, ca. 1966

[2] Turi, Jon. “Tesla’s Toy Boat: A Drone before Its Time.” Engadget. N.p., 14 July 2016. Web. 25 Apr. 2017

[3] Capek, Karel. R.U.R. (Rossum’s Universal Robots). N.p.: Dover Thrift Editions, 2001. Print

[4] Hello, Robot. Design between Human and Machine, exhibition catalogue, 11 Feb – 14 May 2017 Vitra Design Museum

[5] Robot Institute of America, 1979

[6] D. Schön, “The Reflective Practitioner: How professionals think in action”. London: Temple Smith, 1983

[7] Vincent van Gogh, Letter to: Willemien van Gogh, Arles, Sunday, 26 August 1888

[8] Philosophical Dictionary: Nicole-Nussbaum. N.p., n.d. Web. 13 July 2017

[9] Plato, Phaedo 73c-75c

[10] C.G. Jung. Psychological Types. Bollingen Series XX, Volume 6, Princeton University Press, 1971

[11] Blackler, Alethea L., Popovic, Vesna and Mahar, Douglas P. (2005) Intuitive Interaction Applied to Interface Design. In Proceedings of the International Design Congress – IASDR 2005, Douliou, Taiwan

[12] Buchanan, Matt. “How Steve Jobs Made the iPad Succeed When All Other Tablets Failed.” Wired. Conde Nast, 03 June 2017. Web. 13 July 2017

[13] Djajadiningrat JP, Wensveen SAG, Frens JW, Overbeeke CJ (2004) Tangible products: redressing the balance between appearance and action. Special issue on tangible interaction of the J Personal Ubiquit Comput 8:294—309

[14] Norman DA (1999) Affordances, conventions and design. Interactions 6(3):38—43

[15] “Macintosh 128K – Mashpedia Free Video Encyclopedia.” Mashpedia. N.p., n.d. Web. 13 July 2017

[16] Raskin, J., 1994, Intuitive Equals Familiar, Communications of the ACM, 37(9), pp. 17-18.

[17] Grobstein, P. (1994) Variability in Brain Function and Behavior, published in The Encyclopedia of Human Behavior, Volume 4 (V.S. Ramachandran, editor), Academic Press, 1994 (pp 447-458)

[18] Norman, D. A., Nielsen, J. “Gestural Interfaces: A Step Backwards in Usability.” N.p., n.d. Web. 13 July 2017

[19] Geek’s Guide to the Galaxy. “Here’s What Sci-Fi Can Teach Us About Fascism.” Wired. Conde Nast, 14 Jan. 2017. Web. 13 July 2017

[20] Shedroff, Nathan, and Noessel, Christopher. Make It So: Interaction Design Lessons from Science Fiction. New York: Rosenfeld Media, 2012. Print

[21] Dourish, P., Bell, G. “‘Resistance Is Futile’: Reading Science Fiction alongside Ubiquitous Computing.” Personal and Ubiquitous Computing 18.4 (2013): 769-78

[22] Clarke, Arthur C. 2001: A Space Odyssey. New York: New American Library, 2000 edition, pp. 201-203. Print

[23] Asimov, Isaac. I, Robot. London: Dobson, 1950. pp. 36-37. Print

[24] Minority Report. Dir. Steven Spielberg. 20th Century Fox, 2002

[25] Schmitz, M., Endres, C., & Butz, A. (2008). A survey of human-computer interaction design in science fiction movies. In Proceedings of the 2nd international conference on INtelligent TEchnologies for interactive entertainment, ICST, 1-10

[26] Star Wars Trilogy. Dir. George Lucas. 20th Century Fox Home Entertainment, 1977-1983

[27] Star Wars Rebels. Dir. Simon Kinberg, Dave Filoni, Carrie Beck. Lucasfilm Animation, 2014

[28] Aliens. Dir. James Cameron, 20th Century Fox, 1986

[29] Star Trek: Insurrection. Dir. Jonathan Frakes, Paramount Pictures, 1998

[30] Attributed to Stradivari, reported by David W. Nicholson, Philosophy of Education in Action, Routledge, pp.48

[31] Maeda J (1999) Design by numbers. MIT Press, Cambridge

[32] Nielsen, Jakob (1994). Usability Engineering. San Diego: Academic Press. pp. 115—148. ISBN 0-12-518406-9

[33] Holl, S., Pallasmaa, J., Pérez-Gómez, A. 1994. Questions of Perception. Special issue. a+u Publishing Co., Ltd.: Tokyo

List of Figures

Figure 1-13. Luca Giacolini, 2017

Figure 14. Schmitz, M., Endres, C., & Butz, A. (2008). A survey of human-computer interaction design in science fiction movies. In Proceedings of the 2nd international conference on INtelligent TEchnologies for interactive entertainment, ICST, 1-10

Figure 15. “Juillet 2016.” Hyperdrive. N.p., n.d. Web. 04 May 2017. https://unfandestarwars.com/2016/07/

Figure 16. Starwarsrebels. N.p., n.d. Web. 4 May 2017. http://starwarsrebels.wikia.com/wiki/File:Chopper%27sarms.png

Figure 17. Noessel, Christopher. Sci-fi Interfaces. N.p., n.d. Web. 14 July 2017.

Figure 18. “Posts about Patrick Stewart on Ryesofthegeek.” Ryesofthegeek. N.p., n.d. Web. 14 July 2017.

Figure 19-33. Luca Giacolini, 2017
