Saturday 28 March 2015

Getting Physical with Python

In this session with my HomeEd group I introduced some physical computing using the CamJam EduKit.

Last time I blogged about the challenge of teaching a group with my own children (who are not used to a classroom environment). This week I had the additional challenge of my normal child swap falling through so I ended up with 3 of my own children to contend with.

With this in mind my plan was for more independent work with some supporting materials to make it easier for the children to work without my direction all of the time.

I also planned to manage the situation by placing my offspring carefully on either side of me in the room so I could switch between instructing and paying them attention. This worked much better for managing the session, although at one stage it did mean carrying two of my children whilst trying to explain things on the board (using my daughter to point out the relevant bits whilst I talked).

I had decided that I did not want to over-simplify things for the children by using Scratch. We had also been mainly working at the command line, so it made sense to progress with this and use nano to create Python files to control the components. This also followed the CamJam worksheets, so I could use those to provide additional guidance and the children could refer back to the instructions.

As the group is very mixed (ages 5-15) there was a range of experience, but most had not used electrical components on a breadboard before. After a quick introduction they were all setting up the simple LED and resistor circuits.

Most of the group managed to get the lights lit, but it did take some time to get there. The real limiting factor I found with using Python with the younger children was typing speed: they typed the code much more slowly than the older students (as they are still working on their reading skills, this is actually quite a hard task for them).

There was no apparent problem understanding the concepts and adding text to control things, but the amount of text that needed reading and adding to the code was a problem. To make it easier for these younger students (and any students who find reading / typing difficult) it would be useful to reduce the volume of typing that is required to produce a result.

That said, nearly all of the children had at least lit the LEDs by the end of the session, even if this had involved a bit of help with typing from the adults in the room.








Wednesday 25 March 2015

Sheffield Raspberry Jam hosted by BCS South Yorkshire Branch



Last year the BCS South Yorkshire Branch decided it would be good to host some more interactive events. We normally host speakers and have a very similar audience for our talks. The idea was to bring some more people to the BCS and do something a bit different.

There had not been a Raspberry Jam in Sheffield for some time and this looked like a good way to engage with a different community of computing enthusiasts. We set the event up for our March meeting, which would also fall in Science Week.

A call-out on Twitter, and messages to some of our contacts in the Raspberry Pi community, gathered a few projects together to get things rolling. The crowd favourite was the Scalextric set-up that could be controlled by an attached variable resistor, by code on the Raspberry Pi, or over the internet.

Paul from Pimoroni also came along to show off some of their products and talk about what can be done with hardware projects.

The event drew in a fair number of new faces, and the showcase projects prompted a great many questions. This led to discussions between the attendees about what they had seen, interesting ways to use the Raspberry Pi, and how to teach people about computing.

There is also renewed enthusiasm for Raspberry Jam in Sheffield, and there is now a team keen to run regular events (the next is on the 28th of March at Access Space and can be booked here). The Twitter account is back up and running at @PiJamSheffield, where you can find information about future events.

You can find out more about future BCS events in South Yorkshire on the Branch Website.


Raspberry Pi Setup and Hello World

Having introduced the basics of computing, this week the plan was to get the children setting up the Raspberry Pi and starting to program.

There was a little setup confusion with the venue, meaning there was a delay getting everything out and a much less ordered start to the session as kit was quickly located and brought out to us. We also discovered that the table arrangement we were using did not leave enough power outlets available. A swift rearrangement of tables and we had everyone near enough to a plug or extension to get running.

This was a completely new experience for me, as I am used to arriving in my classroom where all of the kit is already set up on tables, and just moving cables around between the desktop and the Raspberry Pi. With the help of Jeremy (one of the other parents with an IT background) and Hamish from the University of Sheffield (who had come to see the Pi Bank kits in action) the children were all eventually up and running.

After the kit arrived we had to tackle the normal setup issues of failed memory cards and working out which display option would work best. This was where the Pi Bank kits really helped. The range of connection options included meant that even with a collection of monitors of differing ages we were able to get all the children connected and logging in.

It was at some point during the effort to help all of the children that I was shown the possible horror of teaching my own children. As a Home Educator I spend a large amount of time teaching my own children but not normally in a large room with other children to share the attention.

I think the sharing of my attention is something that will remain a challenge for my children to get used to and for me to work around. I need to find a way to channel my children's natural desire for my attention in a way that does not affect the group's progress.

Despite these distractions and the issues with the equipment we did manage to get everyone logged in to the Raspberry Pi. In fact we managed to move on and get the children started with looking at the file system and creating basic programs. The children used `ls` to look at the files and folders, then `mkdir` to make their own folders. We then had a go at the traditional "Hello World" program in Python using nano. Some even managed to get the program prompting for user input.
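For anyone wanting to try this at home, the program the children typed into nano looked something like this (a sketch of the classic exercise, not the exact worksheet wording; the greeting function is my own structure so the pieces can be tried separately):

```python
# hello.py - typed into nano, run from the command line with: python3 hello.py

def greeting(name):
    # Build the greeting text for a given name
    return "Hello " + name + "!"

print(greeting("World"))  # the classic first program

# The extension some of the children reached - prompting for user input:
# name = input("What is your name? ")
# print(greeting(name))
```

The input lines are commented out here so the script runs without waiting at a prompt; uncommenting them gives the interactive version.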

Overall we managed to achieve the objectives, even if it was not as calm and organised as I would have liked. The comparison with the setup lessons I have taught in school is a favourable one: we normally teach this in Y8 (12/13 year olds), and I find I can usually expect to get the class only as far as making their own folders before needing to pack up the kit, and that is with all the extra parts already set up on desks. With our mixed-ability, mixed-age group we were able to progress to making a simple program. With some tweaks to the organisation and a reduction in distractions I am expecting the group to progress fairly quickly.

To this end I have changed the setup plan for next week: the tables should be arranged near to the power outlets, and the venue have promised to have the monitors, keyboards and mice set up and waiting for us. This should allow us to get started more quickly and move on to the physical computing experiments I have planned. I have also purchased and burned a fresh set of SD cards that I will be assigning to the children to use and save their work on each week.

Friday 13 March 2015

Basic Computing for Sheffield Home Educators

I have been talking for some time about starting a Computing / STEM group for home educated children in Sheffield.

We home educate our 7 year old son and there is a large community of home educators around Sheffield. As computing can be quite equipment heavy it is not something that is easy to do at home and until now there hasn't been an alternative.

The difficulty was finding a venue and some equipment that I could use to run the sessions. Fortunately I had seen a post about a lending library of Raspberry Pi equipment that had been set up at Sheffield University - the Pi Bank. Fortune was smiling on me, as this also led to the rediscovery of the Access Space. They charge for the space, but they have a flexible teaching space ideal for this sort of group. A few phone calls, Facebook posts and emails later, and the group was all set.


The group is a very mixed one, with children ranging from 5 to 15 with different levels of prior knowledge of computers and programming. This presented a different challenge to my normal classes but makes for an interesting class dynamic. I say class, but the plan is to try not to be too school-like and see how we can follow the children's interests as we progress. We have a few structured 'lesson' type activities planned, but after that I am hoping to split the group down and work on projects that they are interested in.

Today I ran the first ever session with a focus on how computers work and an introduction to algorithms.

First we made a human computer, with the children forming the components and passing information around the computer to perform simple sums.

The children took on roles, with one student acting out each part of the computer and several (mainly the more active and excitable younger boys) passing the information between the components.

The user, although not too keen to hold on to the human mouse, moved the mouse around our A4-paper calculator display, and the mouse driver reported the position. This was passed to the processor, stored in memory, and also displayed on the monitor (children with pencil and paper, and whiteboard and marker, respectively).

This was repeated for each of the movements of the mouse to complete the sum. I had planned on simple single-digit arithmetic for our volunteer processor, but the user had other plans (I did manage to keep it to 2 digits, but I think she would have gone for more if left to her own devices). The processor then calculated the answer and passed that to the monitor and memory. In this case the user forgot to save (or I forgot to ask her to), so we talked about what would happen to the information and then pretended we had saved, to pass the information to the hard drive (child with paper and a pen).

After that we simplified our computer. Using just a camera/computer combination, lots of willing, active information conduits and a rather excitable printer (my son Toby), we experimented with how computers see and describe images using binary. To keep things simple we used a 1-bit black and white image which was only shown to the camera. The camera passed the appropriate 1 or 0 (each an A4 sheet with 1 on the front and black on the back, or 0 with white on the back) to the information carriers, and the printer started to put the image together on the floor.

This was really great as the first stages were rendered accurately but as the information carriers gained confidence and enthusiasm we started to see some arriving out of order which corrupted our image slightly. This gave us an opportunity to talk about the importance of the data arriving in the correct order.
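The activity maps directly onto code. A minimal sketch of what our human printer was doing - rebuilding a 1-bit image from a stream of ones and zeros, row by row (the function name and the sample image are my own, not part of the activity materials):

```python
def render(bits, width):
    """Rebuild a 1-bit image from a string of '1'/'0' characters.

    '1' becomes a black pixel ('#'), '0' a white one ('.'),
    and the stream is cut into rows of the given width.
    """
    rows = []
    for i in range(0, len(bits), width):
        row = "".join("#" if b == "1" else "." for b in bits[i:i + width])
        rows.append(row)
    return "\n".join(rows)

# A small 4x4 test card; swap any two bits and the picture corrupts,
# just as it did on the floor when the carriers arrived out of order.
print(render("1001011001101001", 4))
```

Running this prints a small diamond-like pattern; shuffling the order of the bits demonstrates exactly the corruption the children saw.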

After getting everyone sat down again I introduced our next activity, the Sandwich Making Robot by Philip Bagge. I explained the activity and handed out the sheets to allow the children to plan their sandwich-making algorithms.

After donning the special robot uniform (a pink pinny borrowed from home) I took on the role of robot and we tested some algorithms. We didn't get as far as a full sandwich, but we learnt some good lessons about how to think through a problem. We also took the opportunity to talk about debugging.

I was surprised by how many children thought that cutting the bread bag was the way to open the bread, until at the end someone mentioned there was no 'open' in the instruction set. I thought this was an error until I watched the videos again and noticed that Phil starts with his bread bag open.

Overall I am quite happy with how the session went, and the children seemed for the most part engaged in the activities. I now need to go off and plan for next week's introduction to the Raspberry Pi and programming in Python.




Thank you to Computer Science Unplugged and Philip Bagge for the inspiration for the activities for the session.

This is Phil in action -




Outtakes - https://www.youtube.com/watch?v=leBEFaVHllE - very good lessons on how important it is to get the algorithm correct.

Tuesday 10 March 2015

AgoBo the Hackable Raspberry Pi Robot - now with emotions





My wife gave me a 4Tronix AgoBo robot kit for Christmas (at my request). I built it a few weeks ago but didn't really have time to do anything with it.

The AgoBo is a Raspberry Pi A+ based robot kit. I also ordered the PlusPlate, which adds a NeoPixel and lots of prototyping space on a board that mounts above the RPi. The kit is a really good, affordable robot kit that can be customised very easily, especially with the PlusPlate. It is this customisation that really attracted me to the AgoBo in the first place.

When the robot arrived I thought that the ultrasonic sensor looked like a pair of eyes, but AgoBo was lacking a mouth. Another evening I was rooting through a box of electronic bits I bought for RPi projects and found an 8x8 LED matrix.


I had seen plenty of robots that used these as eyes and thought that this could work. However, with the robot being so small the matrix was far too large. I had another dig in the box and found a more suitably sized replacement.


The 5011AS display fitted just below the ultrasonic sensor, with the pins above and below the main board. Aligned horizontally, the segments could be used to make a smiling or sad face by lighting the correct segments.

This idea was put on the back burner for a couple of weeks whilst normal life got in the way, and I kept thinking about how to mount the module effectively under the board. When I was able to experiment with the robot again (having finally loaded the example software and tried out the supplied Python scripts) I couldn't resist having a try with the mouth idea.

I haven't found time to solder the header on to the plus plate yet and wanted to get the mouth working so I grabbed a breadboard and some cables to try it out before I sorted it all out properly.



I had a ten-cable female-to-female ribbon, so I divided it into two (5 cables in each) to connect the ten pins of the display. With the ends of the cable connected there was very little room between the pins, but with a little Blu-Tack the display mounted nicely, with two pins each side below the board and three above. To keep things tidy I separated the first part of the cable for a short length and then wrapped the cable up over the RPi and under the PlusPlate (with a little Blu-Tack, of course).







I then grabbed a few resistors and connected the cables to the breadboard and then connected the other side to the header I fitted to the main board (in preparation for connecting the plus plate).


This is where I ran into my first problem. Limited time and a failure to read the instructions led to an error in the connections. Instead of looking at the instructions, I looked at the numbers on the top of the PlusPlate and, reading down from the top, used the first available pins. Unfortunately these pins are already in use by AgoBo, so there was a bit of a conflict when I tried to use them to run the mouth.

So, looking back at the instructions, I made a list of the pins that were in use, looked again at the PlusPlate for available pins, and moved the connections to pins that were not already in use by AgoBo.

Once I had the connections all set up (correctly this time) I needed to write the code to run the mouth and control the facial expressions. I decided I wanted a smile (obviously, what's cuter than a smiling robot?), a sad face, a confused face and an open mouth. This time consulting the instructions (the data sheet from Jameco), I drew a little diagram of which pin controlled which segments of the display and worked out a little table of which segments should be displayed for each facial expression.



With this organised I set up a Python library (mouth.py) to set the facial expressions, and then a quick script to test them. The test script (mouthtest.py) shows each expression I have set up so far. I am really pleased with the smile, sad face and 'oh'; I am not too sure about the confused face, so I may not use it very much. These scripts will be available from my AgoBo GitHub fork here.
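The core of that table is easy to sketch: a lookup from expression name to the set of 7-segment segments to light (standard datasheet letters a-g). Note this is my illustration of the idea, not the actual table or pin assignments in mouth.py, which drives GPIO pins directly:

```python
# Segments a-g of a 7-segment display, mounted sideways as a mouth.
# Which segments make which face is an illustrative guess for a
# horizontally mounted display, not the real mouth.py mapping.
EXPRESSIONS = {
    "smile":    {"c", "d", "e"},                 # the lower arc of segments
    "sad":      {"a", "b", "f"},                 # the upper arc
    "oh":       {"a", "b", "c", "d", "e", "f"},  # the outer ring: open mouth
    "confused": {"b", "e", "g"},                 # a zig-zag across the face
}

def segments_for(expression):
    """Return the set of segments to switch on for a given expression."""
    return EXPRESSIONS.get(expression, set())
```

A real mouth.py would then translate each segment letter into a GPIO output call; keeping the table separate from the pin handling makes it easy to tweak the faces.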





With the mouth working I wanted to work the expressions into the normal running program for AgoBo. I had previously written a quick script for him to avoid objects using the ultrasonic sensor, so I used this as a starting point.

I ran into a small issue here, as I had set up the mouth library using GPIO numbers while the AgoBo library is set up using board numbers. After a little head scratching (I am still unsure why, in an error state, the face always seems to display 'oh') I spotted the error and changed the mouth library to use board numbers to match the AgoBo library, and now AgoBo will avoid objects whilst displaying his emotions.
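This numbering clash is a classic RPi.GPIO gotcha: `GPIO.setmode(GPIO.BCM)` uses the chip's GPIO numbers, while `GPIO.setmode(GPIO.BOARD)` counts the physical header pins, and the two names for the same pin look nothing alike. A few pairs from the standard Raspberry Pi header show how easy the mix-up is (these pairs are the published pinout; the selection is mine):

```python
# Physical header pin (BOARD numbering) -> chip GPIO number (BCM numbering)
# for a few pins on the standard Raspberry Pi header.
BOARD_TO_BCM = {
    7:  4,
    11: 17,
    12: 18,
    13: 27,
    15: 22,
}

def to_bcm(board_pin):
    """Translate a BOARD pin number to its BCM equivalent."""
    return BOARD_TO_BCM[board_pin]
```

Two libraries in one program must agree on a single scheme - RPi.GPIO will raise an error if `setmode` is called with conflicting values - which is why the mouth library had to be converted to board numbers.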

Currently he is happy moving forward until he finds an object. This makes him sad and he turns 90 degrees. If he can move forward he is happy again. If instead there is another object in his path he is shocked / cross ('oh') and turns 180 degrees. Again, if the way is clear he is happy again and proceeds. However, if there is another object he becomes confused (or his face does) and then turns 90 degrees (away from the initial object) and proceeds on his way, happy again.
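That behaviour boils down to a tiny state table - which expression to show and how far to turn for each consecutive obstacle. A sketch of the logic (the function and names are my own, not the AgoBo library's API, which also has to drive the motors and read the ultrasonic sensor):

```python
def react(obstacles_in_a_row):
    """Return (expression, degrees_to_turn) for the nth consecutive obstacle.

    0 obstacles: happy, drive straight on. Then sad, shocked ('oh') and
    confused in turn, with a bigger turn for the second obstacle.
    """
    if obstacles_in_a_row == 0:
        return ("smile", 0)
    if obstacles_in_a_row == 1:
        return ("sad", 90)
    if obstacles_in_a_row == 2:
        return ("oh", 180)
    return ("confused", 90)
```

The real run loop would call something like this after each distance reading, resetting the obstacle count whenever the way ahead is clear.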