Reading Radiation

At the altitude the balloon will reach, it will be bobbing around in the stratosphere – a layer of the atmosphere between 20km and 50km above the Earth’s surface that comes with some unusual features. It has been found that certain species of bacteria call the stratosphere home, as does the odd massively ambitious bird. The stratosphere also has a funky temperature profile – it actually gets warmer the higher up through the stratosphere you go.

This counter-intuitive temperature situation is down to the way the gases in the stratosphere interact with radiation. Ozone (O3) in the high stratosphere absorbs most of the sun’s high energy ultraviolet (UV) radiation with wavelengths between 100 and 315nm – UVB and UVC for the SPF-savvy among us. This has the effect of protecting life on Earth from some nasty mutagenesis caused by the UVB and UVC interacting with DNA, leading to all sorts of cancery havoc. In the process, the UV breaks down the O3 into two pieces: bog-standard molecular oxygen (O2) and a single oxygen atom (O).

Further down the stratosphere, once the UVB and UVC have been absorbed, the O2 and O can recombine to form O3 again, and this process generates heat. Below this level the stratosphere receives very little UVB and UVC, so the O3 remains intact, and since there’s no O2 and O combining, no additional heat is generated. Overall this means the stratosphere starts off at the bottom a seriously chilly -60°C, and by the time you get to the top the temperature has reached almost 0°C. Toasty.

So travelling up through the stratosphere means that our payload will experience increasing amounts of UV radiation. In addition it will experience more of other sorts of radiation as the amount of shielding provided by the atmosphere decreases. Cosmic rays are a type of particle radiation whose origins are still not fully understood. These particles smash into the atmosphere and interact with the particles there, creating all sorts of secondary particles. Some of these eventually reach the ground and give us all a dose of radiation.

People who work in environments where their exposure to radiation could be higher than normal (such as in hospitals, nuclear power stations and aircraft) can wear badges, called film badge dosimeters, that record exposure to this kind of radiation by examining the changes to photographic film in the badge. The photographic film changes when hit by radiation, so when the film is developed it is possible to see how much radiation the badge (and therefore the wearer of the badge) has been exposed to.

We will use a similar process to record the balloon and payload’s exposure to radiation. We’ll send up some photographic film in a bag that will stop light getting to the film but that won’t stop the cosmic rays or their secondary particles. We’ll cover part of the film with a message written in lead. The lead will prevent the film underneath it from being exposed by the cosmic rays, and this will create a pattern on the film which will be revealed when it is developed into a photograph. Cosmic!

The Story of a Box

Where possible, we’ve been going back to basics and learning the techniques and technology required to manufacture items for the project ourselves. Here’s the story of a simple box.

We really didn’t know where to start manufacturing apparatus for some of the science experiments, so we were really lucky to be chatting about it at Bristol Hackspace when a friend, Joe, overheard our discussions and offered to help out. Little did he know where that would lead him!
image

We started with the Pressure Art demonstration. The concept was a series of small balloons filled with different substances to demonstrate how they reacted at different pressures, eventually bursting to create a Pollock-esque painting…space art if you will 🙂 The whole process would be filmed.

image

image

image

image

Once we got talking about it, we realised how involved the design could be even for a simple box. Given the freedom to design it as we wished, a world of possibilities opened up.

These are some of the considerations for the design:

  • Material
  • Creating the art
  • Camera angle
  • Spillage
  • Balloon fixings
  • Strength
  • Lighting
  • Other design considerations

image

Material
We quickly decided upon acrylic/Perspex: we could cut it using the Hackspace’s laser cutter, and it provided good visibility for filming as well as being strong and relatively inexpensive.

Creating the art
Whilst we wanted to demonstrate the behaviour of differing substances at low pressure, a key objective was to produce an item of space art, so we had to ensure the balloons and paper were placed in a way that the paper would catch optimal ‘splatter’. We planned to use blotting paper so it would absorb any liquid readily and accurately capture splatter patterns.

Camera angle
This was difficult to decide upon – we could film from any angle, but the angle had to be compatible with the lighting and the artwork produced. Initially we planned to film from above, but we realised that liquids could pool on the bottom, which would spoil the artwork if the paper was there.

Spillage
As well as dealing with pooling liquids, we had to ensure the container would not leak, though it might still require drainage to ensure the camera was not obscured.

Balloon fixings
We had initially planned to fix the balloons to the base as we thought we may have to use an inverted premade case. During our discussions with Joe, we considered just leaving them loose in the container but we eventually decided upon balloons hanging from the top – this was for aesthetic reasons as well as practical ones – being able to identify which balloon (and containing substance) was which and also being able to rig them to ensure they exploded (without compromising other items in the payload).

Strength
The box as a structure needed to be strong enough to withstand the pressure of the balloons expanding inside.

Lighting
How to light the box perplexed us somewhat. Lighting was needed, but could be a benefit or a hindrance depending on the style and placement. Whilst we liked the idea of LEDs that hung amongst the balloons, or behind them embedded in the lid, we felt this might compromise the viewing quality too much. To ensure we get maximum ‘artwork’ along the sides, we have decided that the best use of the LEDs will be to mount or embed them in the base of the main chamber. We may need to shield against light spillage so that they don’t cause flare. We will conduct experiments to see whether surface-mount LEDs or normal LEDs would be better, and whether we could disperse some light into the acrylic itself.

Other design considerations
At least two sides of the box had to sit flush with the outer polystyrene box so that it did not puncture it. The lid had to have a means of easily opening and closing it (we made the tabs longer on one side), and it also needed a basic means of securing it so that it didn’t fly open easily. We made a rudimentary spacer/key that could be taped into place.

image

image

Prototyping and testing

To laser cut the box, we needed to produce CAD DXF files. Joe took this to a whole new level by writing a PHP program that would draw the CAD files automatically at any scale. The benefit of this is that changes could easily be made, and it also meant that we could first produce a small plywood model to prove the design. Here is that model as it took shape…
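
We haven’t published Joe’s PHP generator, but the core idea can be sketched in a few lines of Python: emit the DXF LINE entities for one rectangular face of the box, with every dimension multiplied by a scale factor. The face dimensions below are made up for illustration – only the parameterised-scale idea comes from Joe’s program.

```python
def box_face_dxf(width, height, scale=1.0):
    """Emit a minimal DXF describing the rectangular outline of one
    box face, with all dimensions multiplied by `scale`."""
    w, h = width * scale, height * scale
    corners = [(0, 0), (w, 0), (w, h), (0, h)]
    lines = ["0", "SECTION", "2", "ENTITIES"]
    for i, (x1, y1) in enumerate(corners):
        x2, y2 = corners[(i + 1) % 4]  # next corner, wrapping around
        lines += ["0", "LINE", "8", "0",
                  "10", f"{x1:.3f}", "20", f"{y1:.3f}",   # line start point
                  "11", f"{x2:.3f}", "21", f"{y2:.3f}"]   # line end point
    lines += ["0", "ENDSEC", "0", "EOF"]
    return "\n".join(lines)

# A plywood scale model at 50% of a (hypothetical) 200mm x 150mm face:
model = box_face_dxf(200, 150, scale=0.5)
```

Because the scale is just a parameter, the same function could emit both a small plywood test model and the full-size acrylic parts from one design.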

image

image

image

image

image

The last step of the design process was that we needed to test to see how the liquids behaved to confirm the placement of the camera and whether drainage was required. To do this, we needed to simulate a low pressure environment. That will be part of another blog…

Science Hack Weekend – Space Wax

Space Wax is an experiment in recording a 3D representation of the journey of the box.

image

Throughout the journey the box and its contents will be thrown this way and that. Space Wax is an attempt to record this chaotic journey not in terms of ascent and descent but in terms of the swinging and swaying jiggles and joggles it will experience.

image

The idea is to get something that will react to the small movements but will hold its shape and eventually set hard. We thought, how about molten wax?

We then thought that it would be better if the wax could do its thing whilst suspended, rather than just splashing about on the bottom of a container. Could we get the wax to remain suspended in a substance that would allow it to move and take its own shape, whilst also being viscous and dense enough to support it?

Test 1 – different waxes, different substrates

image

image

Wax Type | Suspension Substrate | Result
Carving wax | Wallpaper paste | [image]
Carving wax | Hand sanitizer | [image]
Carving wax | Shampoo | [image]
Microcrystalline wax | Wallpaper paste | [image]
Microcrystalline wax | Hand sanitizer | [image]
Microcrystalline wax | Shampoo | [image]
Bottle sealing wax | Wallpaper paste | [image]
Bottle sealing wax | Hand sanitizer | [image]
Bottle sealing wax | Shampoo | [image]

The considered opinion was that bottle sealing wax showed the most promise, so we did a few more tests with it, trying to get it to sink nicely into the substrate. One hypothesis was that if the substrate was warmer then the wax would stay molten longer and have more time to form a shape. Another was that a more diluted (aqueous) substrate would allow the wax to sink more.

image

Test 2 – Bottle sealing wax and even more different substrates

Suspension Substrate | Result
Diluted wallpaper paste | [image]
Hand sanitizer sandwich | [image]
Diluted shampoo | [image]
Really diluted wallpaper paste | [image]
Warm water | [image]
Cold water | [image]
Olive oil | [image]

So the results were a little inconclusive, as we never convincingly got the wax to sink into the substrate! We tried olive oil in an attempt to use a substrate that wasn’t aqueous, and the result (a layer of wax swishing over the surface of the oil) was distinctly not what we were expecting, and certainly sub-optimal!

image

As something of an afterthought, we thought about how to play with the wax to make it more dense (rather than altering the substrate). A moment of inspiration – we had some left-over iron powder from the failed air-activated hand-warmers. So…

Test 3 – Wax with iron powder in a range of substrates

Suspension Substrate | Result
Diluted wallpaper paste | [image]
Hand sanitizer | [image]
Shampoo | [image]
Diluted shampoo | [image]
Wallpaper paste | [image]
Hand sanitizer | [image]
Shampoo | [image]

Progress!

image

image

Meeting Roundup: Technical Test Build: Photoblog!

Livephotoblogging today’s meeting 🙂

image

Badly behaving Raspberry Pi

image

Massive Yagi antenna David whipped up yesterday…this will help us diagnose our radio issues

image

Men at work

image

Quarterwave aerial with radials

image

The first rule of radio is…you can never have too many connectors!

The second rule of radio is…antenna is easier to spell than aerial

RANGE TEST
We had real problems in our range tests last time and didn’t know where to start diagnosing the issue. David stepped in to help us out via a process of elimination.

First of all he used a standalone radio receiver to ensure there were no problems with the signal. This is the black walkie talkie type device in the photo.

image

The yellow device is a walkie talkie for communicating with Will who was up to several hundred metres away (if you squint at the pic below you can maybe see Will in the distance).

image

image

image

The first tests weren’t much better than our previous ones, achieving about 150m, so David attached a Yagi antenna to the receiver. This has much greater gain (sensitivity) but is directional, which would help us identify whether the issue lay with the signal itself or elsewhere. Immediately our range increased to about 500 metres, but we started getting corruption of data.
image

image

The next thing we did was swap transmitter antennas to identify whether the problem lay there or elsewhere. It turned out that it did: we could see differences between the aerials, and any one of these (or all of them) may be contributing to the range issues. Quarterwave aerials need quite specific dimensions, so Will is going to have a look at this.

image

In addition to this, David noticed that the frequency separation wasn’t as good with our own hardware, so we will be reviewing the resistors etc. to see if we can achieve a greater difference between the high and low audio tones that represent the ones and zeros being transmitted. Whilst the system isn’t broken as such, it’s desirable to make the separation a bit clearer in case we encounter noise.
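
For context: the ones and zeros are sent as two audio tones (frequency-shift keying), and the “separation” is the frequency gap between those tones. This Python sketch generates an FSK waveform and recovers the two tones from zero crossings; the sample rate, baud rate and tone frequencies below are illustrative assumptions, not our transmitter’s actual values.

```python
import math

SAMPLE_RATE = 8000   # samples per second (illustrative)
MARK_HZ = 1585       # tone for a '1' bit (assumed; a few-hundred-Hz
SPACE_HZ = 1160      # shift between mark and space is typical)
BAUD = 50            # bits per second

def fsk_samples(bits):
    """Each bit becomes a burst of either the mark or the space tone,
    with continuous phase across bit boundaries."""
    samples, phase = [], 0.0
    per_bit = SAMPLE_RATE // BAUD
    for bit in bits:
        freq = MARK_HZ if bit else SPACE_HZ
        step = 2 * math.pi * freq / SAMPLE_RATE
        for _ in range(per_bit):
            samples.append(math.sin(phase))
            phase += step
    return samples

def dominant_freq(chunk):
    """Rough frequency estimate from upward zero crossings
    over one bit period."""
    crossings = sum(1 for a, b in zip(chunk, chunk[1:]) if a < 0 <= b)
    return crossings * BAUD  # crossings per bit period -> Hz

wave = fsk_samples([1, 0])
one_tone = dominant_freq(wave[:160])   # first bit period (160 samples)
zero_tone = dominant_freq(wave[160:])  # second bit period
```

The bigger the gap between `one_tone` and `zero_tone`, the easier it is for the receiver to tell the bits apart when noise creeps in – which is exactly what reviewing the resistor values is meant to improve.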

Connecting to a USB 3G dongle with the Raspberry Pi

Seeing as we won’t have all that much control over where our balloon payload ends up, we have to anticipate what we will do if it lands in the ocean, on top of a tree, or somewhere else bad – and also the possibility that we will lose radio communication and fail to track the balloon to where it lands. To give ourselves an increased chance of retrieving the payload, or at least some of our HD images, we will be attempting to communicate our GPS location and our images via a 3G internet connection upon landing.

The USB 3G dongles known to be compatible with the Raspberry Pi are documented here. We chose the Huawei E1750, available for around £10-£20 in the UK.

To get the dongle working, the following prerequisite steps are required:

1) Install ppp (the point-to-point protocol daemon for Linux)

sudo apt-get update

sudo apt-get install ppp

2) Install and configure usb-modeswitch

By default, most USB 3G dongles have two modes: one in which the dongle acts as flash storage (for installing drivers when it is plugged into a Windows machine), and another in which it can be used as a modem. usb-modeswitch makes sure that the dongle switches into the second (modem) mode rather than staying in the first. It can be installed with the command:

sudo apt-get install usb-modeswitch

usb-modeswitch will run whenever the dongle is plugged in. However, it does not run at startup if the device is already plugged in, so we need a udev rule. First, power down the Pi, plug the dongle in and start the Pi up.

Now enter the command:

lsusb

A list of information about all USB devices plugged into the Pi will be displayed; there should be an ID of the form XXXX:YYYY shown for the USB 3G dongle.
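
If you want to grab that XXXX:YYYY pair programmatically rather than reading it off the screen, a small Python sketch will do it. The sample lsusb output below is illustrative (your Bus/Device numbers will differ); 12d1:1446 is an ID commonly reported by Huawei dongles while still in flash-storage mode, before usb-modeswitch has done its job.

```python
import re

def find_usb_id(lsusb_output, keyword):
    """Return the (vendor, product) ID pair from the lsusb line whose
    description contains `keyword` (case-insensitive), or None."""
    for line in lsusb_output.splitlines():
        if keyword.lower() in line.lower():
            match = re.search(r"ID ([0-9a-f]{4}):([0-9a-f]{4})", line)
            if match:
                return match.group(1), match.group(2)
    return None

# Illustrative lsusb output:
sample = """\
Bus 001 Device 002: ID 0424:9514 Standard Microsystems Corp. SMC9514 Hub
Bus 001 Device 004: ID 12d1:1446 Huawei Technologies Co., Ltd. E1750
"""
vendor, product = find_usb_id(sample, "huawei")
```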

Create the file /etc/udev/rules.d/41-usb_modeswitch.rules with the following content:

ATTRS{idVendor}=="XXXX", ATTRS{idProduct}=="YYYY", RUN+="usb_modeswitch '%b/%k'"

Now, after rebooting the Pi with the dongle plugged in and running lsusb again, YYYY should be a different number – the product ID changes when the dongle switches into modem mode.

3) Install and configure sakis3g

This is a script that we will use to connect the Pi to 3G via our USB 3G dongle.

Download sakis3g:
sudo wget "http://www.sakis3g.com/downloads/sakis3g.tar.gz" -O sakis3g.tar.gz

Unzip sakis3g:
sudo tar -xzvf sakis3g.tar.gz

Navigate into the unzipped folder and then make sakis3g executable:
sudo chmod +x sakis3g

4) Create a shell script “connect-to-3g.sh” to connect to the internet using sakis3g:

Create the shell script “connect-to-3g.sh” with the following content:

#!/bin/sh
SAKIS3G_LOCATION="./downloads/files/fgsvmqk3jv8ajq2673u21hkbh4/targz/sakis3g"
MODEM="12d1:1001"
APN="3internet"
sudo $SAKIS3G_LOCATION connect APN=$APN OTHER="USBMODEM" USBMODEM=$MODEM

Now, connect to 3G simply by plugging in the 3G dongle, waiting a few seconds for usb-modeswitch to work, and then running the connect-to-3g script:

sudo sh connect-to-3g.sh

Remote retrieval of images

But how do we get our pictures if we can’t locate the box? Or we can locate the box but it’s guarded by vicious foxes (like this one)?

The process we need is as follows:

  • Camera takes pictures
  • Trigger point(s) reached – e.g. box is on the ground so altitude stops changing, or a certain amount of time passes from launch.
  • Pi connects to camera (workaround because camera can’t take pictures and connect to Pi at the same time)
  • Pi finds images on the camera and copies them to its SD card.
  • Pi sends images over internet via 3G, from most recent image backwards (to help with retrieval).
  • Ground station accesses images.
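
Two of the steps above can be sketched in a few lines of Python: the “altitude stops changing” trigger and the most-recent-first upload order. The window size, tolerance and filenames are assumptions for illustration, not our final flight code.

```python
def landed(altitude_log, window=5, tolerance=10.0):
    """Trigger when the last `window` altitude readings (metres)
    vary by less than `tolerance` -- i.e. the box has stopped moving
    vertically."""
    if len(altitude_log) < window:
        return False
    recent = altitude_log[-window:]
    return max(recent) - min(recent) < tolerance

def upload_order(filenames):
    """Most recent image first, assuming the camera numbers files in
    capture order (e.g. GOPR0001.JPG, GOPR0002.JPG, ...)."""
    return sorted(filenames, reverse=True)

# Illustrative data: a descent followed by readings on the ground.
readings = [30000, 21000, 9000, 412, 408, 405, 406, 404]
photos = ["GOPR0001.JPG", "GOPR0003.JPG", "GOPR0002.JPG"]
```

Sending the newest images first matters because the last photos taken before landing are the ones most likely to show where the box ended up.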

Dan and Will will be working on this system over the next few weeks: getting images off the GoPro (maybe a wired connection between the GoPro and the RPi that switches the GoPro into drive mode at the trigger point); getting a 3G dongle to work with the RPi; and finally syncing the images from the RPi to a cloud service or pushing them out via email over 3G.

The Test Build – 17th May 2015

image

Men x4
Women x2 (not pictured as off doing useful and important things)
Laptops  x5
RPi x4
Bespoke aerial x1
Wibbly radio noises x1
Router x1
RPi hubs x3
GoPro x1
Foam blocks x5
Polystyrene boxes x2
I could go on…

The first range test along the Crescent was successful, until the RPi fell over. But still transmitters were transmitting and receivers were receiving so we rebooted the RPi et voila! Data moving from box to laptop!

The UBlox GPS module

The UBlox NEO6MV2 connects to the Raspberry Pi via a serial port. Since the transmitter will be using the Pi’s built-in serial port, the GPS will connect to the Pi via a USB serial adapter. (It would be possible to have the transmitter use the Pi’s built-in serial TX (transmit) pin and the GPS use the built-in RX (receive) pin, but sharing the port this way would be difficult because the two devices run at different baud rates – and it is also useful to be able to change the default settings of the GPS module, which requires a TX line to the GPS.)

The UBlox GPS module, like most GPS modules, will only work if it pretty much has line of sight to the satellites (in other words, it has to be outside; the inch or so of polystyrene of our payload box should not affect the GPS signal, however). When it is outside, by default it receives lots of data from one or more satellites about position, time, date and a lot of other stuff that most people don’t care about – e.g. information about the satellites that the GPS is in contact with.

For our project we have configured the UBlox GPS module to output nothing by default, and instead just to report a sentence containing the latitude, longitude, altitude, speed, heading and time whenever the Raspberry Pi asks for it.

If you are interested in seeing the details of how the software for controlling the GPS module works, see here.
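
Our actual GPS software (linked above) polls the module directly, but as an illustration of the kind of data involved, here is a minimal Python sketch that validates and decodes a standard NMEA GPGGA sentence – the fix sentence most GPS modules emit by default. (Speed and heading come from a different sentence type, so this shows position, altitude and time only.) The sample sentence is the textbook example, not a real fix from our module.

```python
def nmea_checksum_ok(sentence):
    """An NMEA checksum is the XOR of all characters between '$' and
    '*', compared against the two hex digits after '*'."""
    body, _, given = sentence.strip().lstrip("$").partition("*")
    calc = 0
    for ch in body:
        calc ^= ord(ch)
    return format(calc, "02X") == given.upper()

def parse_gga(sentence):
    """Extract time, latitude, longitude and altitude from a GPGGA
    sentence; return None if the checksum fails."""
    if not nmea_checksum_ok(sentence):
        return None
    f = sentence.split(",")
    lat = int(f[2][:2]) + float(f[2][2:]) / 60    # ddmm.mmm -> degrees
    if f[3] == "S":
        lat = -lat
    lon = int(f[4][:3]) + float(f[4][3:]) / 60    # dddmm.mmm -> degrees
    if f[5] == "W":
        lon = -lon
    return {"time": f[1], "lat": lat, "lon": lon, "alt_m": float(f[9])}

# The widely-used textbook GPGGA example sentence:
sample = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
fix = parse_gga(sample)
```

The checksum test matters for us because a corrupted sentence picked up over a marginal serial link could otherwise be reported as a bogus position.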

Meeting Roundup: Physical Test Build

Today’s session was focused on a physical test build. This basically meant a hands-on session, where we cut foam with breadknives and packed bags with shredded paper to simulate the physical payload. We had the coloured markers, sticky tape and Stanley knives out…mocking up all the components we didn’t actually have. I don’t know that I should say it was a little bit like going back to playschool, but it was…for adult geeks.

It was really good to start seeing how components related to each other and start looking at the realities of where items would need to be placed, structured and packed. It also showed us how far along we really were and just how much space we might have available for the science experiments.

Working with physical objects instigated lots of useful discussion, including how we will secure components (current thinking is Velcro), what needs to be insulated or sealed and the structure (in layers). We have decided to keep the radio transmitter aerial inside the box to protect it as it may be required to locate the payload. We intend to run location tests (‘foxhunts’) so if this proves a problem, then we will look at fixing the aerial externally.

It’s not until you move out of the realm of theory and into physical reality that the ‘interesting’ challenges start arising. For instance, we wanted to test how we mounted the camera into the side of the box (considering field of vision, structural integrity and also minimizing the risk of damage). Aside from noting that we’d need to accommodate the cable and pick a suitable material and sealant for mounting the camera, we took our first images to check that the box wouldn’t be visible in them. Here is our first image…it’s Will!

Hello Will!
image

However, all our test images came out dark (underexposed), like this one (we tried one without a bright light source in it that might trick the sensor)…

image

This was something that none of us had anticipated. We soon realized that the exposure sensors were obscured…something that we’ll need to take into consideration for our camera mount design and next test build. You can prototype all you like, but it’s not until you start building that the real challenges arise.

We also debated how the box should be attached to the balloon: with cords secured through the box (risking the cord cutting into the polystyrene, but with almost zero risk of the box becoming detached from the parachute), or with the box contained within a net bag (higher risk of becoming unbalanced). We decided to run with the first option, with the net bag as our fallback position if the internal cords proved problematic during any future tests.

Most of all, this exercise showed to us how far along we already were. Special thanks to Will, who is now working out of Bournemouth. Partly due to train delays, he spent 9+ hours in transit to make it to this session. Now that’s dedication!

With the technical build fully on track, our next stage will be to focus on the science experiments.

Using the Raspberry Pi to get photos from the GoPro camera

Inside our payload, we will have a GoPro camera recording HD video and taking HD photos throughout the flight. When the payload lands back on earth, we want to stop the recording and, if we can obtain an internet connection, upload as many photos as possible in case we do not retrieve the payload.

The problem:

The Pi cannot control the GoPro directly, e.g. use it as a webcam. It can only read and copy images and video from the GoPro when the GoPro is connected via USB and in its “plugged into PC” mode – during which it will not take photos or record video.

The solution:

To work around this problem, we decided to have the Pi connected to the GoPro via a modified USB lead. The idea is to cut the ground wire of the USB lead, and connect the two severed ends via a transistor – which acts as a switch controlled by the Raspberry Pi.

Throughout the flight, the switch will be open, so that even though the camera and Pi are connected via USB, the circuit will have a break in it. The camera will not know it is connected to the Pi, and so will be in recording/photo mode (we will start the recording ourselves before the launch). When our sensors tell us that we are back on earth, the Pi will close the switch by applying a current to the transistor base pin, and the camera will go into “plugged into PC” mode – perfect!

The only limitation is that once the switch has been closed, the camera will not resume recording when it is opened again. However, closing the switch just once is fine for our needs.

The circuit and the code:

This is the circuit we are using:

Circuit to switch on and off the connection between the Raspberry Pi and a camera

The transistor used is a BD437 NPN transistor, but any NPN transistor which will allow 500mA @ 5V to flow from collector to emitter will work.

It is very simple to make the circuit:

Simply cut the ground wire (black) of a USB lead. Connect one of the Pi’s GPIO pins to the transistor base, connect the emitter to the USB-plug side of the severed ground wire, and the collector to the mini-USB side.

Here is an example of code to control this circuit from the Pi, written in Java using the Pi4J library:

https://github.com/will093/pi-transistor-switch
