Tuesday, October 19, 2010

INPUTS AND OUTPUTS



A complete version of the inputs and outputs diagram for the interactive bus stop.

Interview with a person who is blind

As a group we interviewed a blind man waiting for a bus and asked his opinion on some of the main points of interaction with the bus stop. He thought that sliding your finger down the bus stop to find where the bus is in relation to the stop was arbitrary, and believed an audio message might be more appropriate. He also told us what typically happens when he is waiting at a large bus stop with multiple buses passing through: usually each driver will stop and ask what bus he needs to take. Because of this, he said, an audio message should announce which bus is arriving at any time.

He also raised an issue we hadn’t considered earlier: how do blind people know what stop they need to get off at when they are already on the bus? This could be solved with another audio message once the passenger is on the bus, since the system would already know which stop he needs to get off at. These audio messages will only sound when the ‘priority seating’ button is pressed.

Flowchart



This is a finalised version of a flow diagram that explains the sequence of events when either push button is pressed. If Push Button One is pressed, the left row of LEDs goes through a basic sequence showing a typical bus coming down a route and stopping at the bus stop. If Push Button Two is pressed, the right row of LEDs goes through a similar process, except that it is delayed half-way down the route.
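The two button sequences can be sketched as a simple simulation. This is only a sketch of the logic from the flow diagram, not the actual microcontroller code; the function name, LED count and tick-based timing are all assumptions.

```python
def led_sequence(num_leds=8, delay_at=None, delay_ticks=3):
    """Return the order in which a route's LEDs light up, one per tick.

    Models the flow diagram: the bus 'moves' down the route by lighting
    each LED in turn. If delay_at is set (Push Button Two's route), the
    bus pauses at that LED for delay_ticks extra ticks, representing a
    bus held up half-way along its route.
    """
    sequence = []
    for led in range(num_leds):
        sequence.append(led)
        if led == delay_at:
            # Bus is delayed here: the same LED stays lit for extra ticks.
            sequence.extend([led] * delay_ticks)
    return sequence

# Push Button One: a typical run straight down the route.
print(led_sequence(8))
# Push Button Two: the same run, delayed half-way down the route.
print(led_sequence(8, delay_at=3))
```

In the real prototype each tick would map to a timed step that switches one LED output on and the previous one off.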

The diagram shows white circles next to the LEDs, which are light sensors. We decided to change from a membrane potentiometer to light sensors for several reasons, including price, accuracy and ease of manufacture. There will now be an extra layer of MDF sandwiched between the acrylic layers, with squares removed only around the groups of LEDs/light sensors. This forms a box around each light sensor, so when a finger is placed over the tiny box (~8 mm × 8 mm) all light is blocked and the sensor will produce a signal.

This signal will be used to power a vibrating motor, so people who are blind can feel down the route and know where the bus is in relation to their stop.
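The sensor-to-motor behaviour could work roughly as follows. This is a hypothetical sketch of the decision logic only; the threshold value, the 0–1023 reading range and both function names are assumptions, not the final electronics design.

```python
def finger_covers_sensor(light_level, dark_threshold=50):
    """True when a light sensor reading falls below the threshold,
    i.e. a finger over the 8 mm box has blocked out ambient light.
    Readings are assumed to be 10-bit ADC values (0-1023)."""
    return light_level < dark_threshold

def vibration_outputs(readings, bus_position):
    """For each sensor along the route, decide whether the vibrating
    motor should fire: a finger must be covering that sensor AND the
    bus must currently be at that point on the route."""
    return [finger_covers_sensor(r) and i == bus_position
            for i, r in enumerate(readings)]

# Finger over sensor 2 while the bus is at position 2 on the route:
print(vibration_outputs([900, 850, 20, 870], bus_position=2))
```

Sliding a finger along the route, the motor would then only vibrate at the point where the bus currently is, which is how a sight-impaired person could locate it.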

Tuesday, October 12, 2010

Week 12 Tutorial

Drawing inspiration from the lecture Yasu gave about distributed, locative and sociable media, Patrick and I came up with the final mode of interaction between the passengers waiting at the bus stop and the passengers already on the bus. When the tutors gave us feedback that we needed more interaction between these two groups of users, we automatically started thinking about a sociable communication method, but had a lot of trouble coming up with one. We have now settled on a final interaction that would fall under locative media.

The input is a button located on the bus stop that can be pressed by waiting passengers who need to sit in the priority seating area. The button activates a ‘priority seating light’ on the bus, adjacent to the existing ‘bus stopping light’, which notifies passengers on the bus that someone waiting requires priority seating. When the bus comes to a stop at the next stop, able passengers currently sitting in these areas should move to another seat or stand in the aisle.
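This interaction is essentially a latch, which can be sketched as follows. The class and method names are hypothetical; when exactly the light should clear (here, when the bus departs the stop) is an assumption we have not yet finalised.

```python
class PriorityLight:
    """Hypothetical model of the 'priority seating light' interaction:
    pressing the button at the bus stop latches the light on inside
    the bus; it clears once the bus has stopped and picked up the
    waiting passenger."""

    def __init__(self):
        self.lit = False

    def button_pressed_at_stop(self):
        # Notify passengers already on the bus to vacate priority seats.
        self.lit = True

    def bus_departs_stop(self):
        # Passenger has boarded, so the request is cleared.
        self.lit = False
```

A latch is needed (rather than lighting only while the button is held) because the request must stay visible on the bus until it reaches the stop.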

With this final interaction in place, we set out to draw a flow diagram of all the interactions between all inputs and outputs. We found this to be quite a challenging process, as we had to tie up and confirm every last detail of the network interactions. Tim came along to give us some feedback at this stage. He agreed with the final interaction we had just come up with and liked the broadness of our network. He did, however, express concerns regarding the manufacturing of the bus stop. We had originally decided on a welded aluminium U-bar construction, which Tim talked us out of due to the expense of that manufacturing technique.

So we have now changed the look of the bus stop slightly: it will now have a flat front face and a curved polycarbonate rear. (For more information see Patrick’s blog, as he is covering the object.)

Thursday, October 7, 2010

Approaching the design freeze



We have reached a point where we are very close to a design freeze. Patrick has begun work on a working prototype of the model, and we found that he and I need to work together quite frequently at this stage of construction, as the electronics will be supplied by me. On that note, I have finalised the inputs and outputs for the behavior of the bus stop. The two inputs are, firstly, the buttons located at every stop along the route, so the user can select their destination, and secondly a slider sensor for blind people, which I will explain below.

The two outputs are the lines of LEDs along the bus route that show where the bus is in relation to the stop, and a vibrating motor activated by the slider sensor. Ideally the lines representing the bus routes would all be slider sensors, so that as you slide your finger along a route the motor sends a vibration through the whole bus stop, informing a sight-impaired person where the bus is along the route. But due to construction limitations we will place the slider sensor adjacent to the bus routes.

The next step, from a behavior point of view, is to communicate with Yasu regarding the electronics and the best way of approaching the coding.