Showing posts with label Coding. Show all posts

Wednesday, 13 January 2016

GPIO Zero - making coding less language intensive.




In a previous post Getting Physical with Python I wrote about the difficulty some of the younger children in my computing group had with the volume of typing required to get started with physical computing. They did not struggle with understanding but it took too much time and help to enter the volume of text required. This took away some of the excitement and slowed things down.

At the time I thought that, to speed this up, it would be good to write a Python library to reduce the amount of text needed to get things to happen. Unfortunately I had lots of other things to do and this never went anywhere. However, someone else also thought it would be good to make it easy to get started with physical computing and was able to do something about it.

That person was Ben Nuttall of the Raspberry Pi Foundation. Along with Martin O'Hanlon and Dave Jones he has created GPIO Zero. You can read his account of how it happened on his blog.

This Python library can be used to control components using the GPIO pins very simply. The initial function set is based around the popular CamJam EduKits (Kit 1 - Starter, Kit 2 - Sensors) and makes a great starting point for physical computing using Python.

A simple light and button combination can be controlled with the example below:

from gpiozero import LED, Button
from signal import pause

led = LED(15)
button = Button(14)

button.when_pressed = led.on
button.when_released = led.off

pause()  # keep the script running so the events can fire

instead of something like this:

import time
import RPi.GPIO as GPIO

GPIO.setmode(GPIO.BCM)
GPIO.setwarnings(False)

GPIO.setup(14, GPIO.IN)
GPIO.setup(15, GPIO.OUT)

while True:
    if GPIO.input(14) == False:
        GPIO.output(15, GPIO.HIGH)
    else:
        GPIO.output(15, GPIO.LOW)
    time.sleep(0.5)

The reduction in volume of code and setup required is brilliant. GPIO Zero is an amazing tool for education. This is especially true where the volume of text entry is a barrier (either with younger or SEN children).

The tool allows the focus to be on the programming concepts and not on the entry of text. When I worked with my HomeEd computing group there was a difficulty for a number of the students (aged between 5 and 15) in using the GPIO library as there was a lot of code to enter. They were generally happy with what they were trying to achieve but found that it took a long time to enter the lines of code required just to light up the LEDs.

This meant that in the one hour session that is what we achieved, lighting up the LEDs. Whilst this was a success and the children were happy with getting there it would have been much better to spend more of the time in the session exploring what could be done rather than entering lots of text.

GPIO Zero takes away some of the burden allowing children to focus on what they are trying to achieve rather than on copying out lots of lines of code (especially the set up parts that are conceptually more difficult to grasp and result in questions about what is BCM etc).

I have found that where I have used this it has meant I can move on more quickly and cover more of the computational thinking ideas where previously there would have been more time waiting for the students to catch up with the typing required. It also works well to satiate the desire for instant gratification that appears to be fairly common among my pupils. They only have to spend a short time entering code before they can see a result.

It is also much easier in a classroom to debug the code they have written if there are errors. The reduced volume of code makes for less searching to find the capital that shouldn't be there. This makes students more able to do it themselves, or makes it quicker for me when they can't see what is wrong. The reduction in time taken here gives me the opportunity to get to more pupils and help them to progress.



I have also used this at home with my son (8) whilst he has been creating a robot using the CamJam EduKit 3 - Robots. This was really powerful because it allowed him to achieve results in short pockets of time before he lost focus and wanted to move on. He used the provided worksheets to set up the robot and connect the components and I translated the code parts into GPIO Zero for him to get the robot working.

So in summary the feedback is - Thanks Ben this is an awesome tool to help me teach computing.

If you are interested in using GPIO Zero there is a great getting started guide on the Raspberry Pi website in the resources 'Learn' Section.

More information can be found on pythonhosted.org or on GitHub. There is also a Google Doc with information and a place to add comments / requests.

Friday, 24 July 2015

The PiCycle - an amazing student project



I have had the pleasure this year to have been a supervisor for one of our sixth form students doing an Extended Project Qualification (EPQ).

The EPQ is a level 3 qualification (similar to A-level) for students to do something that they are interested in. This can be an essay or an artifact and can take a myriad of forms. In this case my student chose to do a Raspberry Pi based project which was how I became involved.

Dan had no previous experience of the Raspberry Pi or Python before starting the project. He had done some web development but wanted to try something new. He spent a large portion of the project time mastering the basics of Python and the basic hardware before moving on to getting the project up and running.


The result is the PiCycle, a Raspberry Pi based cycle computer that takes the position of the rider and plots it on a map on a website. The idea was to use the device to prove the designer had completed a planned charity cycle ride across France, but it could be used for all sorts of tracking applications. As well as getting the device working he spent some time getting it to look good too, with branded interfaces on the device and the web tracking page.





The student developed the program to collect the GPS position and store this in a file on the Raspberry Pi and then upload the data to the web to display the position(s). The design means that whilst the cyclist is in areas of poor phone reception they can still log the GPS position (assuming GPS signal) and then upload the data when the phone signal is regained.
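
The store-and-forward idea can be sketched in a few lines of Python. This is a hedged illustration of the design only; the function names and the log format are mine, not Dan's actual PiCycle code:

```python
# Sketch of the store-and-forward design: always log GPS fixes locally,
# then upload any pending fixes whenever a connection is available.
# (Hypothetical names; not the actual PiCycle code.)

def log_position(log, lat, lon):
    """Always record the GPS fix locally, whatever the phone signal."""
    log.append({'lat': lat, 'lon': lon, 'uploaded': False})

def upload_pending(log, upload):
    """Try to send each unsent fix; keep it queued if the upload fails."""
    for fix in log:
        if not fix['uploaded']:
            try:
                upload(fix)          # e.g. an HTTP POST to the website
                fix['uploaded'] = True
            except OSError:
                break                # no signal yet - try again later
```

Because the fixes are only marked as uploaded after a successful send, nothing is lost while the rider is out of phone coverage; the backlog simply drains once the signal returns.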

This project was a great achievement especially considering that Dan had no prior knowledge of the Raspberry Pi or Python before he started the EPQ.

Some more information about the project can be found in his project presentation. He can be found on twitter: @DJWOOLFALL.

Friday, 17 April 2015

The Sound of Music - Sonic Pi with The Home Ed Computing Group

With the last session I had discovered the difficulty of the volume of text required when using Python and the CamJam EduKits for some of the younger children. I have been working on a Python library to help with this, but this week I decided to use a different tool to look at coding concepts.

Sonic Pi is a great tool because it allows you to teach coding and make music at the same time. It was also a good way of being able to progress in complexity with computing concepts (introducing iteration) but keeping the level of text entry required to a minimum.
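
To give a flavour of how little typing is involved, here is a minimal sketch of the kind of looping tune the children wrote (this is Sonic Pi code, run inside the Sonic Pi app rather than plain Ruby; the notes and timings are just examples):

```
# a simple repeating tune introducing iteration
4.times do
  play 60    # middle C
  sleep 0.5
  play 64    # E
  sleep 0.5
end
```

Two commands (play and sleep) plus a loop is enough to make recognisable music, which is far less text than the equivalent first steps in Python.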

Sonic Pi is freely available software as part of the standard Raspbian build and is also available for other platforms (more information here on the Sonic Pi website).

The simple nature of the commands required for the children to be able to make music means it is a really good tool for younger children (or children less able to read / type).

Some of the barriers I had found in the Python session were easily overcome by the simple nature of the commands. I think I would have struggled to get some of the children to start looping blocks of code in Python, yet they were all happily making repeating tunes with Sonic Pi. I really wanted to avoid simplifying things too much, and this has provided a great bridge between the simple block programming that some of the children have done before and text based languages.

Sonic Pi itself is based on Ruby, so it teaches a specific code structure and language that will continue to be useful. It is only simplified in the sense that the tasks it performs behind the scenes are much more complex than the play and sleep combinations suggest, and it is a really engaging and relevant way to work with code.

I have written lots before about Sonic Pi so I will try and avoid this becoming the Sonic Pi fan page. However, in reducing the volume of typing required, I have found another way in which Sonic Pi makes coding accessible to a wider audience.

The group seemed to really enjoy this session and it was one of the most requested areas to do more with.

I am starting to look to move away from just leading the sessions and I already have several children with ideas for their own projects. This is an idea I am really keen to explore as I feel the best way to learn about computing is by finding challenges and solving them using computing. If children can find something that interests them they are much more motivated to explore than if they are being directed by someone else.

Saturday, 28 March 2015

Getting Physical with Python

In this session with my HomeEd group I introduced some physical computing using the CamJam EduKit.

Last time I blogged about the challenge of teaching a group with my own children (who are not used to a classroom environment). This week I had the additional challenge of my normal child swap falling through so I ended up with 3 of my own children to contend with.

With this in mind my plan was for more independent work with some supporting materials to make it easier for the children to work without my direction all of the time.

I also planned to manage the situation by placing my offspring carefully either side of me in the room so I could switch between the instruction and paying them attention. This worked much better for me to be able to manage the session. Although at one stage it did mean carrying 2 of my children whilst trying to explain things on the board (using my daughter to point out the relevant bits whilst I talked).

I had decided that I did not want to over simplify things for the children by using Scratch. We had also been mainly working at the command line, so it made sense to progress with this and use nano to create Python files to control the components. This also followed the CamJam worksheets, so I could use those to provide additional guidance and the children could refer back to the instructions.

As the group is very mixed (5-15) there was a range of experience in the group but most had not used electrical components in a breadboard before. After a quick introduction they were all setting up the simple LED and resistor circuits.

Most of the group managed to get as far as getting the lights lit but it did take some time to get there. The real limiting factor I found with using Python with the younger children was that the speed at which they were able to type the code was very slow compared to the older students (as they are still working on their reading skills this is actually quite a hard task).

There was no apparent problem understanding the concepts and adding text to control things, but the amount of text that needed reading and adding to the code was a problem. To make it easier for these younger students (and any students who find reading / typing difficult) it would be useful to reduce the volume of typing that is required to produce a result.

That said nearly all of the children had at least lit the LEDs by the end of the session even if this had involved a bit of help with typing from the adults in the room.








Tuesday, 10 March 2015

Agobo The Hackable Raspberry Pi Robot - now with emotions





My wife gave me a 4Tronix AgoBo robot kit for Christmas (at my request). I built it a few weeks ago but didn't really have time to do anything with it.

The AgoBo is a Raspberry Pi A+ based robot kit. I also ordered the Plus plate that adds a Neopixel and lots of prototyping space on a board that mounts above the RPi. The kit is a really good affordable robot kit that can be customised very easily, especially with the PlusPlate. It is this customisation that really attracted me to the AgoBo in the first place.

When the robot arrived I thought that the ultrasonic sensor looked like a pair of eyes but AgoBo was lacking a mouth. On another evening I was rooting through a box of electronic bits I had bought for RPi projects and found an 8x8 LED matrix.


I had seen plenty of robots that used these as eyes and thought that this could work. However, with the robot being so small, the matrix was far too large. I had another dig in the box and found a more suitably sized replacement.


The 5011AS display fitted just below the ultrasonic sensor with the pins above and below the main board. Aligned horizontally, the segments could be used to make a smile or sad face by lighting the correct segments.

This idea was put on the back burner for a couple of weeks whilst normal life got in the way, and I kept thinking about how to mount the module effectively under the board. When I was able to experiment with the robot again (I finally loaded the example software and tried out the supplied Python scripts) I couldn't resist having a try with the mouth idea.

I haven't found time to solder the header on to the plus plate yet and wanted to get the mouth working so I grabbed a breadboard and some cables to try it out before I sorted it all out properly.



I had a ten-cable female-to-female ribbon so I divided that into two (5 cables in each) to connect the ten pins of the display. With the ends of the cable connected there was very little room between the pins, but with a little Blu-Tack the display mounted nicely with two pins each side below the board and three above. To keep things tidy I separated the first part of the cable for a short length and then wrapped the cable up over the RPi and under the PlusPlate (with a little Blu-Tack of course).







I then grabbed a few resistors and connected the cables to the breadboard and then connected the other side to the header I fitted to the main board (in preparation for connecting the plus plate).


This is where I ran into my first problem. Limited time and a failure to read the instructions led to an error in the connections. Instead of looking at the instructions I looked at the numbers on the top of the PlusPlate and, reading down from the top, started using the first available pins. Unfortunately these pins are already in use by AgoBo, so there was a bit of a conflict when I tried to use them to run the mouth.

So looking back at the instructions I made a list of the pins that were in use and looked again at the PlusPlate for available pins and moved the connections to pins that were not already in use by AgoBo.

Once I had the connections all set up (correctly this time) I needed to set up the code to run the mouth and control the facial expressions. I decided I wanted a smile (obviously, what's cuter than a smiling robot?), a sad face, a confused face and an open mouth. After this time consulting the instructions (the data sheet from Jameco) I drew a little diagram of which pin controlled which segments of the display and worked out a little table of which should be displayed for each facial expression.



With this organised I set up a Python library (mouth.py) to set up the facial expressions and then a quick script to test them. The test script (mouthtest.py) shows each expression I have set up so far. The smile, sad face and 'oh' I am really pleased with. I am not too sure about the confused face so I may not use that very much. These scripts will be available from my AgoBo GitHub fork here.
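
As a rough illustration of the segment-table approach (the segment letters and choices here are my guesses at the idea, not the actual contents of mouth.py), the table can be kept as a simple dictionary and looked up per expression:

```python
# Hypothetical sketch of a mouth.py-style segment table. Segment names a-g
# follow the usual seven-segment convention; the real wiring and pin numbers
# on the AgoBo differ. Each expression maps to the set of segments to light.
EXPRESSIONS = {
    'smile':    {'d', 'e', 'f'},                  # bottom bar plus sides
    'sad':      {'a', 'b', 'f'},                  # top bar plus sides
    'oh':       {'a', 'b', 'c', 'd', 'e', 'f'},   # open mouth: outer ring
    'confused': {'b', 'e', 'g'},                  # a zig-zag of segments
}

def segments_for(expression):
    """Return the set of segments to light for a named expression."""
    return EXPRESSIONS[expression]
```

The actual library then only needs to switch the GPIO pin for each named segment on or off, so adding a new expression is just one new line in the table.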





With the mouth working I wanted to work the expressions into the normal running program for AgoBo. I had written a quick script previously for him to avoid objects using the ultrasonic sensor so I used this as a starting point.

I ran into a small issue here as I had set up the mouth library using GPIO numbers and the AgoBo library is set up using board numbers. After a little head scratching (I am still unsure why in an error state the face always seems to display 'oh') I spotted the error and changed the mouth library to match the AgoBo Python library, and now AgoBo will avoid objects whilst displaying his emotions.

Currently he is happy moving forward until he finds an object. This makes him sad and he turns 90 degrees. If he can move forward he is happy again. If instead there is another object in his path he is shocked / cross ('oh') and turns 180 degrees. Again, if the way is clear he is happy again and proceeds. However, if there is another object he becomes confused (or his face does) and then turns 90 degrees (away from the initial object) and proceeds on his way happy again.
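
The behaviour above boils down to a small decision rule. Here is a hedged sketch of that logic (the function and return values are my own illustration, not the actual AgoBo script):

```python
def react(obstacle_count):
    """Map consecutive obstacles to an expression and a turn in degrees.

    obstacle_count is how many objects AgoBo has met in a row without
    finding a clear path: 0 means happy and driving forward, 1 means sad,
    2 means shocked, 3 or more means confused. (A sketch of the logic
    described in the post, not the robot's real code.)
    """
    if obstacle_count == 0:
        return ('smile', 0)      # clear path: stay happy, no turn
    elif obstacle_count == 1:
        return ('sad', 90)       # first object: sad face, turn 90 degrees
    elif obstacle_count == 2:
        return ('oh', 180)       # second object: shocked, turn 180 degrees
    else:
        return ('confused', 90)  # still blocked: confused, turn away 90
```

The main loop would then call something like react() each time the ultrasonic sensor reports an obstacle, set the face, and make the turn before trying to drive forward again.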




Wednesday, 19 November 2014

Raspberry Pi CPD in Sheffield


Part of the idea of Picademy was that delegates would go out and spread the word. So as part of my effort I spent this evening delivering Raspberry Pi CPD to the Sheffield CAS hub at Sheffield Hallam University.

The session was mainly made up of Computing ITT students from Sheffield Hallam University so it was really interesting to see a different approach to new information (compared to my secondary pupils). Most of the delegates had little or no exposure to the Pi, but there were two who had used it for their own projects (a security camera and a remote media centre).

After a quick introduction to the Pi we spent time describing how the Pi could be set up in classrooms and introduced the Raspberry Pi foundation's resources. Preparing for this event gave me the chance to look again at what is provided and the resources there really do give all the information that you need to get started using the Raspberry Pi and moving on to using it productively in the classroom.

After the set up and a brief summary of some of the activities available for using the Pi in the classroom, I spent some time focusing on some of my favourite schemes. As a parent of a 7 year old I can't avoid Minecraft at home and have found that it is equally as engaging for secondary pupils. I shared some of Craig Richardson's resources from his blog on Minecraft Pi - Arghbox. The delegates were also given a chance to try out some of the scripts on the Pis they had set up. This may have been a mistake with some of the more game obsessed ITT students (mainly male). This was hastily used to point out the importance of choosing classes and classroom management strategy carefully when using a game students are already familiar with.

We then looked at some of the other ideas I have used in the classroom: the use of Sonic Pi (particularly as an application that appears to appeal more equally to both genders) to engage students' creativity and teach programming in a fun way, and the possibilities of using the GPIO pins for physical computing. I am very interested in 'Personally Meaningful Projects' as a key motivator for students to get involved in programming, and the GPIO pins provide this possibility. I shared some examples of projects my students and students from further afield have created using the Pi. We also discussed the support available from the community.

The great thing about the ITT students is that once they had a spark of an idea they appeared very enthusiastic to take this on and try using this in their teaching practices. Several were keen to borrow the university Raspberry Pi set and some were talking about purchasing their own and projects they could work on. Hopefully this talk will be converted to action and there will be a few more computing teachers in Sheffield schools enthused about the benefits of using Raspberry Pi in the classroom. If nothing else I did a little Picademy product placement and did my best (if not quite 'The Apprentice' level) pitch for the resources available on the Raspberry Pi site. I left the event feeling buzzy and motivated to do it again so it can't be all bad.

The Prezi I used as a placeholder for the introductory videos and some links for the resources we discussed is here.



As a side / end note, this was a chance to play with some presentation tech I can only dream of in my classroom. I had a Pi on one button, the Prezi on another and a visualiser showing the actual Pi on a third. This was the first time I had used the setup at the university and I was very pleased with the possibilities. At the press of a swanky touch screen I could switch between the projected picture of my hands doing magic with the Pi and the actual output of the Pi, then switch to the diagram on the Prezi showing the possible connections. This made the screen work hard for me and really helped to show what was going on. The only downside was managing multiple mice and a second keyboard a few paces away (due to the university padlocked setup) while talking at the same time. I don't imagine I will be getting this sort of system in my classroom anytime soon but it was good to try it out for an evening.

Monday, 8 September 2014

Conversations with computers using python


I had the idea of making the standard 'Hello World' introduction to programming a new language a little bit more interesting for my Y7 class.

The idea of the computer conversations and the recent Turing test success (or near success) by Eugene gave me the idea of getting the students to make a (vastly simplified) version of Eugene using Python. The plan would be to teach them some basic Python concepts like displaying information and filling variables based on user input, and then maybe progress onto selection.

This would initially start as a very simple program with them entering their name and then including the name in the response. The students could then work up some complexity from there using more questions. The next step would be to switch things round and have the computer answer user question based on a list of pre-programmed responses.

The Plan

Introduction

Show the class a video by way of introduction to the test. Something like - Jeremy Clarkson Explains the Turing test or The Turing test, as described by Expect Labs CEO, Timothy Tuttle.

Explain the plan to create a basic chat bot that can have a basic conversation with the user.

Task 1- Hello World

Students to open IDLE and then a new window. They then create a basic 'Hello World' program and save and run it.

e.g. 

print ('hello world!') 

Students to experiment changing hello world for whatever greeting they choose. (Yes they will probably make it say rude words!)

Task 2 - Talk to me

Obviously this is a pretty boring one-way conversation, so we need to add in the ability for the user to input information.

e.g.
myName = input('What is your name?')
print ('hello ' + myName)

Students to experiment with this and then try adding more questions.

e.g.
myName = input('What is your name?')
print ('hello ' + myName)
myColour = input('What is your favourite colour?')
print ('That is amazing ' + myName + ', ' + myColour + ' is my favourite colour too')

Task 3 - Selection

To try and make the computer's responses a little more realistic it would be good if the response wasn't the same whatever you type, so we can add selection to change the response based on what is input.

This can start with a simple if else:

e.g.
myColour = input('What is your favourite colour?')
if myColour == 'orange':
    print ('That is amazing ' + myName + ', ' + myColour + ' is my favourite colour too')
else:
    print ('It is nice that you like ' + myColour + ', ' + myName + ', I prefer orange')

This can then be moved on to add more choice using else if (elif):

e.g.
myColour = input('What is your favourite colour?')
if myColour == 'orange':
    print ('That is amazing ' + myName + ', ' + myColour + ' is my favourite colour too')
elif myColour == 'black':
    print (myName + ' you are strange, ' + myColour + ' is not even a real colour, how can it be your favourite?')
else:
    print ('It is nice that you like ' + myColour + ', ' + myName + ', I prefer orange')

Task 4 - Ask me a question

This basic idea can then be used to switch things around and let the students ask questions. This will only handle a pre-programmed list of questions and answers but completes the very basic conversation idea.

This can be added to the first code or used to start a new program. If it is in the same program, some of the old variables can be used to add more interest.

First the computer needs to prompt the user to ask a question:

myQuestion = input('Ask me a Question?')
if myQuestion == 'how old are you':
    print ('I am 12, how old are you ' + myName + '?')
elif myQuestion == 'What is your name?':
    print ('My name is Simon')
else:
    print ("Sorry, I didn't understand that question")

Students experiment with their own versions.


This is a fairly simple program, so it only has one question opportunity and only a couple of possible questions. If students still have time they could be challenged to find a way to give more than one question opportunity, or to add further questions and answers. Another idea is for the students to program a combination of questions and answer opportunities, like in a conversation. They could also look at getting the answers from a text file, and possibly use the text file to allow the program to 'learn' by storing answers given by students to questions and then using those to respond when asked that question later!
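
One way to handle more questions without a long chain of elifs is to keep the answers in a dictionary (a sketch of the extension idea only; the responses and helper name are mine, not part of the lesson plan):

```python
# A dictionary of known questions makes it easy to add answers, and a
# lookup function can tidy up the user's typing before matching.
responses = {
    'how old are you': 'I am 12, how old are you?',
    'what is your name': 'My name is Simon',
}

def answer(question):
    """Look up a response, ignoring case and a trailing question mark."""
    key = question.lower().strip(' ?')
    return responses.get(key, "Sorry, I didn't understand that question")

# A while loop then gives the user as many question opportunities as
# they like, e.g.:
#     while True:
#         q = input('Ask me a question (or type bye): ')
#         if q == 'bye':
#             break
#         print(answer(q))
```

Stronger students could extend answer() to read the dictionary from a text file, which leads naturally into the 'learning' chatbot idea above.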


Plenary

Show some examples of students programs to the class and use them to highlight the key parts of the program.


Further Resources

Since writing this I have found a short scheme of work based on the Turing test on the Raspberry Pi website http://www.raspberrypi.org/learning/turing-test-lessons/. This 3 lesson scheme explains the idea of the Turing test and uses a speech module to have the robot speak to you.