Tuesday, 10 March 2015

Agobo The Hackable Raspberry Pi Robot - now with emotions





My wife gave me a 4Tronix AgoBo robot kit for Christmas (at my request). I built it a few weeks ago but didn't really have time to do anything with it.

The AgoBo is a Raspberry Pi A+ based robot kit. I also ordered the Plus plate that adds a Neopixel and lots of prototyping space on a board that mounts above the RPi. The kit is a really good affordable robot kit that can be customised very easily, especially with the PlusPlate. It is this customisation that really attracted me to the AgoBo in the first place.

When the robot arrived I thought that the ultrasonic sensor looked like a pair of eyes, but AgoBo was lacking a mouth. One evening I was rooting through a box of electronic bits I had bought for RPi projects and found an 8x8 LED matrix.


I had seen plenty of robots that used these as eyes and thought that this could work. However, with the robot being so small, the matrix was far too large. I had another dig in the box and found a more suitably sized replacement.


The 5011AS display fitted just below the ultrasonic sensor with the pins above and below the main board. Aligned horizontally, the correct segments could be lit to make a smile or a sad face.

This idea was put on the back burner for a couple of weeks whilst normal life got in the way, and I kept thinking about how to mount the module effectively under the board. When I was able to experiment with the robot again (having finally loaded the example software and tried out the supplied python scripts) I couldn't resist having a try with the mouth idea.

I haven't found time to solder the header on to the PlusPlate yet, and wanted to get the mouth working, so I grabbed a breadboard and some cables to try it out before sorting it all out properly.



I had a ten-way female-to-female ribbon cable, so I divided it into two (5 cables in each) to connect the ten pins of the display. With the ends of the cable connected there was very little room between the pins, but with a little Blu-Tack the display mounted nicely with two pins each side below the board and three above. To keep things tidy I separated the first part of the cable for a short length and then wrapped the cable up over the RPi and under the PlusPlate (with a little Blu-Tack, of course).







I then grabbed a few resistors, connected the cables to the breadboard and connected the other side to the header I had fitted to the main board (in preparation for connecting the PlusPlate).


This is where I ran into my first problem. Limited time and a failure to read the instructions led to an error in the connections. Instead of looking at the instructions I looked at the numbers on the top of the PlusPlate and, reading down from the top, started using the first available pins. Unfortunately these pins are already in use by AgoBo, so there was a bit of a conflict when I tried to use them to run the mouth.

So, looking back at the instructions, I made a list of the pins already in use, looked again at the PlusPlate for available pins and moved the connections to pins that AgoBo was not already using.

Once I had the connections all set up (correctly this time) I needed to write the code to run the mouth and control the facial expressions. I decided I wanted a smile (obviously, what's cuter than a smiling robot?), a sad face, a confused face and an open mouth. After this time consulting the instructions (the data sheet from Jameco) I drew a little diagram of which pin controlled which segment of the display and worked out a little table of which segments should be lit for each facial expression.



With this organised I set up a Python library (mouth.py) to set the facial expressions, and then a quick script to test them. The test script (mouthtest.py) shows each expression I have set up so far. I am really pleased with the smile, sad face and 'oh'; I am not too sure about the confused face, so I may not use that very much. These scripts will be available from my AgoBo GitHub fork here.
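As a rough illustration of the idea (this is not the actual mouth.py, and both the pin numbers and the segment choices below are placeholders rather than my real wiring), the expressions can be held in a simple table mapping each face to the set of segments to light:

```python
# Sketch of a mouth.py-style expression table for a 7-segment display.
# Segment names follow the standard a-g layout; the BCM pin numbers and
# the segments chosen for each face are illustrative placeholders only.

SEGMENT_PINS = {
    'a': 5, 'b': 6, 'c': 13, 'd': 19, 'e': 26, 'f': 16, 'g': 20,
}

# Which segments to light for each facial expression (the display is
# mounted sideways, so a curve of segments reads as a mouth shape).
EXPRESSIONS = {
    'smile':    {'c', 'd', 'e'},
    'sad':      {'a', 'b', 'f'},
    'oh':       {'a', 'b', 'c', 'd', 'e', 'f'},   # full outline = open mouth
    'confused': {'b', 'e', 'g'},
}

def segments_for(expression):
    """Return the set of segments to light, or an empty set if unknown."""
    return EXPRESSIONS.get(expression, set())

def pins_for(expression):
    """Translate an expression into the GPIO pins to switch on."""
    return sorted(SEGMENT_PINS[s] for s in segments_for(expression))
```

In the real library each returned pin would then be driven high or low with RPi.GPIO, but keeping the expression table separate from the GPIO calls makes it easy to add new faces.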





With the mouth working I wanted to work the expressions into the normal running program for AgoBo. I had previously written a quick script for him to avoid objects using the ultrasonic sensor, so I used this as a starting point.

I ran into a small issue here, as I had set up the mouth library using GPIO (BCM) numbers while the AgoBo library uses board numbers. After a little head scratching (I am still unsure why, in an error state, the face always seems to display 'oh') I spotted the error and changed the mouth library to match the AgoBo library. Now AgoBo will avoid objects whilst displaying his emotions.
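For anyone who hits the same problem: RPi.GPIO can address pins either by their physical position on the header (BOARD mode) or by the Broadcom GPIO number (BCM mode), and the two schemes must not be mixed in one program. A few entries of the 40-pin header mapping show how different the numbers are:

```python
# A few entries of the physical (BOARD) to BCM (GPIO) pin mapping on the
# Raspberry Pi A+/B+ 40-pin header, to illustrate why mixing the two
# numbering schemes in one program drives the wrong pins.
BOARD_TO_BCM = {
    7: 4, 11: 17, 12: 18, 13: 27, 15: 22, 16: 23, 18: 24, 22: 25,
}

def board_to_bcm(board_pin):
    """Convert a physical header pin number to its BCM GPIO number."""
    return BOARD_TO_BCM[board_pin]
```

In RPi.GPIO the choice is made once with `GPIO.setmode(GPIO.BOARD)` or `GPIO.setmode(GPIO.BCM)`, so every library used in one program has to agree on the same mode.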

Currently he is happy moving forward until he finds an object. This makes him sad and he turns 90 degrees. If he can move forward he is happy again. If instead there is another object in his path he is shocked/cross ('oh') and turns 180 degrees. Again, if the way is clear he is happy again and proceeds. However, if there is yet another object he becomes confused (or his face does), turns 90 degrees (away from the initial object) and proceeds on his way, happy again.
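That behaviour boils down to a small state machine. A minimal sketch of the decision logic, as a pure function (the names are illustrative, not taken from the actual AgoBo library):

```python
# Sketch of the avoidance behaviour described above: given how many
# obstacles AgoBo has met in a row, return the facial expression to show
# and how far to turn. Illustrative only - not the real AgoBo code.
def react(consecutive_obstacles):
    if consecutive_obstacles == 0:
        return ('smile', 0)        # clear path: happy, keep going
    elif consecutive_obstacles == 1:
        return ('sad', 90)         # first obstacle: sad, turn 90 degrees
    elif consecutive_obstacles == 2:
        return ('oh', 180)         # blocked again: shocked, turn 180
    else:
        return ('confused', 90)    # still blocked: confused, turn 90 away
```

The real running program would call something like this each time the ultrasonic sensor reports an obstacle, then set the expression and drive the motors accordingly.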




Sunday, 8 February 2015

The Visual-Pi-ser: a low cost classroom visualiser

In my last post I described my idea for a low cost visualiser based on the Raspberry Pi kit I was sent as part of the Element 14 Raspberry Pi Educators Road Test. I promised I would add more detail, so here it is.



The box from the kit is used as a stand to hold the RPi case. The case in the kit has a mounting point for the camera, and the arrangement holds the camera steady over the items to be shown. The main issue with this set up is that the camera has a relatively large minimum focal distance, so items at that range are out of focus until the camera is modified.

The camera is a fixed-focus unit and the lens is held in place with blobs of glue, which can be removed with a craft knife to free up the lens (N.B. the lens can be sensitive to static and is easily damaged, so this needs to be done with care).


This modification allows the lens to be rotated and the items brought into focus. It also has the added advantage of slightly magnifying the subject.

The size of the box supplied is ideal for small physical computing projects, but a larger box would give a larger field of view for bigger projects. The camera itself is controlled using the raspivid command:

raspivid -t 300000 -rot 90

This rotates the video by 90 degrees (as the camera mount is at 90 degrees to the box) and runs the video for 300000 milliseconds (300 seconds). If a longer or shorter time period is required the number of milliseconds can be changed. If you want to terminate the video session you can do so with Ctrl+C.
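If you are scripting this from Python rather than typing it at the terminal, a tiny helper (my own sketch, not part of the kit) keeps the seconds-to-milliseconds conversion in one place:

```python
# Build the raspivid command used above from a duration in seconds and a
# rotation in degrees. The resulting list could be passed straight to
# subprocess.run() on the Pi.
def raspivid_command(seconds, rotation=90):
    return ['raspivid', '-t', str(seconds * 1000), '-rot', str(rotation)]
```

For example, `raspivid_command(300)` reproduces the command shown above, and changing the first argument changes the recording time without having to count zeros.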


I have created a guide on how to create your own Visual-Pi-ser on GitHub.


Wednesday, 21 January 2015

Low cost Raspberry Pi Visualiser

The kit I received from taking part in the Element 14 Raspberry Pi Educators Roadtest gave me another project idea.

After delivering CPD at Sheffield Hallam University I was very envious of their AV set up, with a visualiser, PC and RPi all connected to the screen. I can't make all this happen, but with the RPi and camera I can make my own.

For work with the Raspberry Pi the box was even the right size to make a stand, so I added a lighting solution (a cheap torch from the garage and/or a clip-on e-reader light). Now for under £40 I had a visualiser set up that I could use in the classroom.


A quick mock up above shows the basic idea but I'll post the full details here once it has been completed along with the python code to control the camera.

Monday, 19 January 2015

Element 14 Educators Road test

Over the holidays I have been experimenting with timelapse photography of crystal formation with the Raspberry Pi. I have been doing this as I have been selected as one of the participants in the Element 14 Raspberry Pi Educators Roadtest. If you haven't come across the Element 14 Roadtests before, they are a scheme where kit is sent out to a selected group of volunteers to test and write about. They run the tests fairly regularly and all you need to do to enter is write a proposal of what you will do with the kit.

This particular roadtest is of the Raspberry Pi B+ Camera Kit and I wanted to come up with a proposal using the camera functionality in a way I could use the kit with students to teach Computing and Science. My proposal was to work with KS3 and Home educated students on two slightly different projects, both based around taking timelapse photography of crystals forming.



We have been playing with crystals at home with my Home Educated son for a while, and this looked like a great project to try with the Raspberry Pi and Pi Camera Module. The addition of the Wi-Pi adapter and the case with camera mount made it ideal for applications like this, where connection to a monitor and keyboard would be difficult and a secure mount for the camera essential.

Part of the deal is that you write at least three blog posts and a review of the kit on the Element 14 community website. I am still working on the review and the third blog post, but the first two are up already:

Part 1: Introduction to the project

Part 2: Testing



EDIT: (09/03/15)

The Road test is now complete and I have some more blog entries and a review of the kit. I also ended up doing a couple of side projects and wrote a scheme of work for the Time Lapse Crystals project.

Part 3: Home Education Project

Part 4: Summary

Review

Scheme of Work

Side Project 1: Visual-Pi-ser

Side Project 2: Snow Timelapse


The Roadtest was a good opportunity to experiment with the RPi camera module and I enjoyed the chance to try something different. The time lapse photography was a good way to combine science and computing, and also produced some beautiful results. I have included one of the example videos here. This one is probably my favourite; it was taken from below the crystals with lighting above, as part of the home education project.


Friday, 19 December 2014

The first rule of computing club........

....................don't talk about computing club


It struck me today that my computing club is made up almost entirely of girls. It also occurred to me that I hadn't ever publicly called it computing club.

I started the group to work up some entries for the Sonic Pi space music competition. A small group of students (10) arrived the first week and I showed them how to set up the Raspberry Pi and gave them a quick introduction to the Sonic Pi interface. Before long they were producing basic tunes and adding loops.

The group has since grown to around 15 regularly attending students and they are all engaging in coding pieces of music for the competition. It was not until I was looking through the list that I realised quite how many girls I had ended up with. Out of the 15 I only have 3 boys who regularly attend (compared with our Scratch games club, which is entirely boys).

I have worked before with groups of students using Sonic Pi and found that it is great for engaging all students (not mainly the boys like a lot of the robotics work I have done) but this was something different as none of the students had used Sonic Pi before.

This was a marketing issue!

I sent round the poster along with a note to all of the Y7-9 classes asking for anyone who wanted to try making music with the Raspberry Pi, no previous experience necessary. There was no mention of computing, coding, or programming.

The response was all from students who were interested in making music rather than those interested in programming. They now all know (not that it was a big secret really) that to make the music they are coding, but to them they are making music. This difference in approach has affected how they engage with something new: they are not intrinsically interested in the method of making the computer do something, they are interested in the end result (in this case music).

I now have a predominantly female group of programmers all engaged in coding. Once we have finished working on the music competition I am going to be looking for ways to maintain this engagement using the output as the motivator and the coding as the 'what you have to do to get there' bit.

Wednesday, 19 November 2014

Raspberry Pi CPD in Sheffield


Part of the idea of Picademy was that delegates would go out and spread the word. So as part of my effort I spent this evening delivering Raspberry Pi CPD to the Sheffield CAS hub at Sheffield Hallam University.

The session was mainly made up of Computing ITT students from Sheffield Hallam University, so it was really interesting to see a different approach to new information (compared with my secondary pupils). Most of the delegates had little or no exposure to the Pi, but there were two who had used it for their own projects (a security camera and a remote media centre).

After a quick introduction to the Pi we spent time describing how the Pi could be set up in classrooms and introduced the Raspberry Pi Foundation's resources. Preparing for this event gave me the chance to look again at what is provided, and the resources really do give all the information you need to get started with the Raspberry Pi and move on to using it productively in the classroom.

After the set up and a brief summary of some of the activities available for using the Pi in the classroom, I spent some time focusing on some of my favourite schemes. As a parent of a 7 year old I can't avoid Minecraft at home, and have found that it is equally engaging for secondary pupils. I shared some of Craig Richardson's resources from his blog on Minecraft Pi - Arghbox. The delegates were also given a chance to try out some of the scripts on the Pis they had set up. This may have been a mistake with some of the more game-obsessed ITT students (mainly male), but it was hastily used to point out the importance of choosing classes and classroom management strategies carefully when using a game students are already familiar with.

We then looked at some of the other ideas I have used in the classroom: the use of Sonic Pi (particularly as an application that appears to appeal more equally to both genders) to engage students' creativity and teach programming in a fun way, and the possibilities of using the GPIO pins for physical computing. I am very interested in 'Personally Meaningful Projects' as a key motivator for students to get involved in programming, and the GPIO pins provide this possibility. I shared some examples of projects my students, and students from further afield, have created using the Pi. We also discussed the support available from the community.

The great thing about the ITT students is that once they had a spark of an idea they appeared very enthusiastic to take it on and try using it in their teaching practice. Several were keen to borrow the university Raspberry Pi set, and some were talking about purchasing their own and the projects they could work on. Hopefully this talk will be converted to action and there will be a few more computing teachers in Sheffield schools enthused about the benefits of using the Raspberry Pi in the classroom. If nothing else I did a little Picademy product placement and gave my best (if not quite 'The Apprentice' level) pitch for the resources available on the Raspberry Pi site. I left the event feeling buzzy and motivated to do it again, so it can't be all bad.

The Prezi I used as a placeholder for the introductory videos, and some links to the resources we discussed, is here.



As a side/end note, this was a chance to play with some presentation tech I can only dream of in my classroom. I had a Pi on one button, the Prezi on another and a visualiser showing the actual Pi on a third. This was the first time I had used the set up at the university and I was very pleased with the possibilities. At the press of a swanky touch screen I could switch between the projected picture of my hands doing magic with the Pi and the actual output of the Pi, then switch to the diagram on the Prezi showing the possible connections. This made the screen work hard for me and really helped to show what was going on. The only downside was managing multiple mice and a second keyboard a few paces away (due to the university's padlocked setup) while talking at the same time. I don't imagine I will be getting this sort of system in my classroom anytime soon, but it was good to try it out for an evening.

Monday, 8 September 2014

Conversations with computers using python


I had the idea of making the standard 'Hello World' introduction to programming a new language a little bit more interesting for my Y7 class.

The idea of computer conversations and the recent Turing test success (or near success) by Eugene gave me the idea of getting the students to make a (vastly simplified) version of Eugene using Python. The plan would be to teach them some basic Python concepts, like displaying information and filling variables based on user input, and then maybe progress on to selection.

This would initially start as a very simple program with them entering their name and then including the name in the response. The students could then work up some complexity from there using more questions. The next step would be to switch things round and have the computer answer user question based on a list of pre-programmed responses.

The Plan

Introduction

Show the class a video by way of introduction to the test. Something like - Jeremy Clarkson Explains the Turing test or The Turing test, as described by Expect Labs CEO, Timothy Tuttle.

Explain the plan to create a basic chat bot that can have a basic conversation with the user.

Task 1- Hello World

Students to open IDLE and then open a new window. They then create a basic 'Hello World' program, then save and run it.

e.g. 

print ('hello world!') 

Students to experiment changing hello world for whatever greeting they choose. (Yes they will probably make it say rude words!)

Task 2 - Talk to me

Obviously this is a pretty boring, one-sided conversation, so we need to add the ability for the user to input information.

e.g.
myName = input('What is your name?')
print ('hello ' + myName)

Students to experiment with this and then try adding more questions.

e.g.
myName = input('What is your name?')
print ('hello ' + myName)
myColour = input('What is your favourite colour?')
print ('That is amazing ' + myName + ', ' + myColour + ' is my favourite colour too')

Task 3 - Selection

To try and make the computer's responses a little more realistic it would be good if the response wasn't the same whatever you type, so we can add selection to change the response based on what is input.

This can start with a simple if else:

e.g.
myColour = input('What is your favourite colour?')
if myColour == 'orange':
    print ('That is amazing ' + myName + ', ' + myColour + ' is my favourite colour too')
else:
    print ('It is nice that you like ' + myColour + ' ' + myName + ', I prefer orange')

This can then be moved on to add more choice using else if (elif):

e.g.
myColour = input('What is your favourite colour?')
if myColour == 'orange':
    print ('That is amazing ' + myName + ', ' + myColour + ' is my favourite colour too')
elif myColour == 'black':
    print (myName + ' you are strange, ' + myColour + ' is not even a real colour, how can it be your favourite?')
else:
    print ('It is nice that you like ' + myColour + ' ' + myName + ', I prefer orange')

Task 4 - Ask me a question

This basic idea can then be used to switch things around and let the students ask questions. This will only handle a pre-programmed list of questions and answers but completes the very basic conversation idea.

This can be added to the first code or used to start a new program. If in the same program, some of the old variables can be used to add more interest.

First the computer needs to prompt the user to ask a question:

myQuestion = input('Ask me a Question?')
if myQuestion == 'how old are you':
    print ('I am 12, how old are you ' + myName + ' ?')
elif myQuestion == 'What is your name?':
    print ('My name is Simon')
else:
    print ("Sorry I didn't understand that question")

Students experiment with their own versions.


This is a fairly simple program, so it only has one question opportunity and only a couple of possible questions. If students still have time they could be challenged to find a way to give more than one question opportunity, or to add further questions and answers. Another idea is for the students to program a combination of questions for the student and opportunities to answer questions, like in a conversation. They could also look at getting the answers from a text file, and possibly use the text file to allow the program to 'learn': storing answers given by students to questions, then using those to respond when the same question is asked later!
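One possible sketch of that 'learning' extension, for students who get this far (illustrative only; a classroom version would read and write the answers dictionary from a text file between runs, and use input() rather than a fixed list of questions):

```python
# Sketch of a learning chat bot: answer each question from a dictionary of
# known responses, and 'learn' a new answer from the user when the question
# is unknown. `questions` is a list of (question, taught_answer) pairs,
# where taught_answer is only used when the bot does not know the question.
def chat(questions, answers=None):
    answers = dict(answers or {})
    replies = []
    for question, taught_answer in questions:
        if question in answers:
            replies.append(answers[question])       # known: answer it
        else:
            answers[question] = taught_answer       # unknown: learn it
            replies.append('Thanks, I will remember that')
    return replies, answers
```

For example, asking the same question twice in a row means the first ask teaches the bot and the second ask gets the taught answer back, which demonstrates the 'learning' idea without any file handling.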


Plenary

Show some examples of students programs to the class and use them to highlight the key parts of the program.


Further Resources

Since writing this I have found a short scheme of work based on the Turing test on the Raspberry Pi website http://www.raspberrypi.org/learning/turing-test-lessons/. This 3-lesson scheme explains the idea of the Turing test and uses a speech module to have the robot speak to you.