Class.Assignment1And3PeakFindingAndDatalogging History


Changed lines 235-238 from:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch...so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection through the use of MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time...After many days of attempting to program with php to my MYSQL account this seemed like a viable and interesting alternative. Although the sound files could also be uploaded to a server after they were completed. Necessity is the mother of invention!! Following is the digital pic code for the anenometer sending binary data through the serial port to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm of the sound and video as they played by attatching the serial information to different swiches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt record" in Max.It's also an interesting way to combine readouts from various sensors at once an to see how their patterns relate to each other,like light, wind and sound patterns manipulating something all at once...Here are some photographs of the process and the digital pic code that I used.(it also lit an led on the board so that I could see what I was getting.)
to:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order to get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, so the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with this sensor because it just sends out pulses (1's and 0's) at different rates depending on the speed of the wind over any given period. This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of the wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotations per second to speed/time? How can the information that the wind sensor gives us affect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch... so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file of 0's and 1's would be kind of interesting... you might be able to see variances in visual texture if the "drawing" were printed on, say, a 6' x 6' piece of paper.) But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection using MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time... After many days of attempting to program with PHP to my MySQL account, this seemed like a viable and interesting alternative (although the sound files could also be uploaded to a server after they were completed). Necessity is the mother of invention!! Following is the digital Pic code for the anemometer sending binary data through the serial port to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and, based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm for the sound and video as they played by attaching the serial information to different switches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt.record" in Max. It's also an interesting way to combine readouts from various sensors at a given time and place and to see how their patterns relate to each other (like light, wind and sound patterns) by manipulating something all at once... Here are some photographs of the process and the digital Pic code that I used. (It also lit an LED on the board so that I could see what I was getting.)
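
One low-tech way to keep a record alongside the Max patch would be to tally the pulses themselves. This is only a sketch, not part of the patch: it assumes the digital Pic code shown further down is on the first serial port at 9600 baud, and "windLog.txt" is just a placeholder name. It counts the 0-to-1 transitions each second in Processing and appends them to a timestamped log.

import processing.serial.*;

Serial windPort;
PrintWriter log;      // plain text log of pulses per second
int lastByte = 0;     // previous 0/1 byte from the reed switch
int pulseCount = 0;   // rising edges seen in the current second
int lastSecond;

void setup() {
  // assumes the anemometer Pic is the first serial port, 9600 baud
  windPort = new Serial(this, Serial.list()[0], 9600);
  log = createWriter("windLog.txt");   // placeholder file name
  lastSecond = millis() / 1000;
}

void draw() {
  // once per second, write a timestamped pulse count and reset
  int now = millis() / 1000;
  if (now != lastSecond) {
    log.println(hour() + ":" + minute() + ":" + second() + "\t" + pulseCount);
    log.flush();
    pulseCount = 0;
    lastSecond = now;
  }
}

void serialEvent(Serial p) {
  int b = p.read();
  // count only 0 -> 1 transitions, so one switch closure = one pulse
  if (b == 1 && lastByte == 0) {
    pulseCount++;
  }
  lastByte = b;
}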
Changed lines 235-238 from:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch...so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection through the use of MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time...After many days of attempting to program with php to my MYSQL account this seemed like a viable and interesting alternative. Although the sound files could also be uploaded to a server after they were completed. Necessity is the mother of invention!! Following is the digital pic code for the anenometer sending binary data through the serial port to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm of the sound and video as they played by attatching the serial information to different swiches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt record" in Max.It's also an interesting way to combine readouts from various sensors at once an to see hoe their patterns relate to each other...Here are some photographs of the process and the digital pic code that I used.(it also lit an led on the board so that I could see what I was getting.)
to:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch...so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection through the use of MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time...After many days of attempting to program with php to my MYSQL account this seemed like a viable and interesting alternative. Although the sound files could also be uploaded to a server after they were completed. Necessity is the mother of invention!! Following is the digital pic code for the anenometer sending binary data through the serial port to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm of the sound and video as they played by attatching the serial information to different swiches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt record" in Max.It's also an interesting way to combine readouts from various sensors at once an to see how their patterns relate to each other,like light, wind and sound patterns manipulating something all at once...Here are some photographs of the process and the digital pic code that I used.(it also lit an led on the board so that I could see what I was getting.)
Changed lines 154-155 from:
Once the Processing was running with Serial into it (the values in the Pic code were bracketed in order to make the readings in processing visually manageable), then I felt brave enough to attempt the peak finding with the photocell and wind sensor (which I would use furhter down the line in the datalogging assignment...)
to:
Once Processing was running with serial data coming into it (the values in the Pic code were bracketed in order to make the readings in Processing visually manageable), I felt brave enough to attempt the peak finding with the photocell and wind sensor (which I would use further down the line in the datalogging assignment...)
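
Concretely, the "bracketing" in the Pic code further down just caps the 10-bit reading at 300 and rescales it so it fits in one serial byte. The same arithmetic as a tiny stand-alone sketch (the example value 212 is arbitrary):

int topVal = 300;                    // cap chosen from the photocell's observed range
int adcValue = 212;                  // an arbitrary example 10-bit reading
int clipped = min(adcValue, topVal); // "bracket" anything above the cap down to it
int sendMe = clipped * 255 / topVal; // rescale so the value fits in one serial byte
println(sendMe);                     // prints 180 for this example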
Added lines 275-277:
!!!'''Max/MSP/Jitter Patch:'''
Added lines 242-244:

!!!'''Pic Code: Digital out (w/ LED).'''
Changed lines 240-241 from:
to:
http://itp.nyu.edu/~ras4425/sensor/sensorboard3.JPG
Changed lines 239-241 from:
http://itp.nyu.edu/~ras4425/sensor/sensorboard1.jpg
to:
http://itp.nyu.edu/~ras4425/sensor/sensorboard1.JPG
Changed lines 239-241 from:
http://itp.nyu.edu/~ras4425/sensor/.jpg
to:
http://itp.nyu.edu/~ras4425/sensor/sensorboard1.jpg
Changed lines 239-241 from:
http://itp.nyu.edu/~ras4425/sensor/sensorboard.jpg
to:
http://itp.nyu.edu/~ras4425/sensor/.jpg
Changed lines 237-239 from:
http://itp.nyu.edu/~ras4425/sensor/sensorboard.JPG
to:


http://itp.nyu.edu/~ras4425/sensor/sensorboard.jpg
Added lines 237-239:
http://itp.nyu.edu/~ras4425/sensor/sensorboard.JPG
Changed lines 235-236 from:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch...so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection through the use of MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time...After many days of attempting to program with php to my MYSQL account this seemed like a viable and interested alternative. Although the sound files could also be uploaded to a server after they were completed. Necessity is the mother of invention! Following is the digital pic code for the anenometer sending seial data to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm of the sound and video as they played by attatching the serial information different swiches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt record" in Max...Here are some photographs of the process and the digital pic code that I used.(it also lit an led on the board so that I could see what I was getting.)
to:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch...so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection through the use of MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time...After many days of attempting to program with php to my MYSQL account this seemed like a viable and interesting alternative. Although the sound files could also be uploaded to a server after they were completed. Necessity is the mother of invention!! Following is the digital pic code for the anenometer sending binary data through the serial port to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm of the sound and video as they played by attatching the serial information to different swiches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt record" in Max.It's also an interesting way to combine readouts from various sensors at once an to see hoe their patterns relate to each other...Here are some photographs of the process and the digital pic code that I used.(it also lit an led on the board so that I could see what I was getting.)
Added lines 230-232:

!!'''Assignment 3: '''
Added lines 264-268:
http://itp.nyu.edu/~ras4425/sensor/jitterpatch911.JPG
http://itp.nyu.edu/~ras4425/sensor/jitterlights1.JPG
http://itp.nyu.edu/~ras4425/sensor/jitterjpeg1.JPG
http://itp.nyu.edu/~ras4425/sensor/jitterjpeg2.JPG
http://itp.nyu.edu/~ras4425/sensor/jitterjpeg3.JPG
Changed lines 232-233 from:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch...so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection through the use of MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time...After many days of attempting to program with php to my MYSQL account this seemed like a viable and interested alternative. Although the sound files could also be uploaded to a server after they were completed. Necessity is the mother of invention! Following is the digital pic code for the anenometer sending seial data to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm of the sound and video as they played by attatching the serial information different swiches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt record" in Max...Here are some photographs of the process and the digital pic code that I used.
to:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch...so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection through the use of MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time...After many days of attempting to program with php to my MYSQL account this seemed like a viable and interested alternative. Although the sound files could also be uploaded to a server after they were completed. Necessity is the mother of invention! Following is the digital pic code for the anenometer sending seial data to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm of the sound and video as they played by attatching the serial information different swiches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt record" in Max...Here are some photographs of the process and the digital pic code that I used.(it also lit an led on the board so that I could see what I was getting.)
Changed lines 232-263 from:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased under neath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How ncould the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a bynary output activated by the reed switch...so how could this pattern or rythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chaose to see how the patterns and rythms of the wind sensor could be experienced through sound and 3D video projection through the use of MAX/MSP/JITTER.
to:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased underneath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How could the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a binary output activated by the reed switch...so how could this pattern or rhythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chose to see how the patterns and rhythms of the wind sensor could be experienced through the pacing of sound and 3D video projection through the use of MAX/MSP/JITTER. The sound files could then be recorded and act as a record of the wind sensor's activity over any period of time...After many days of attempting to program with php to my MYSQL account this seemed like a viable and interested alternative. Although the sound files could also be uploaded to a server after they were completed. Necessity is the mother of invention! Following is the digital pic code for the anenometer sending seial data to the Max program. (The Max program takes video of light patterns generated by sunlight reflecting on a river and based on those light patterns, filters the velocity, pitch and duration of a sonic field recording of that immediate environment through comb filters. The wind sensor was used to create a pacing or rhythm of the sound and video as they played by attatching the serial information different swiches in the patch.) The eventual output could then be recorded and logged using "sfrecord~" and "jit.qt record" in Max...Here are some photographs of the process and the digital pic code that I used.

Based on code by Tom Igoe.

include "modedefs.bas"
define debug_REG PORTC
define debug_BIT 6
define debug_BAUD 9600
define debug_MODE 1

X var byte ' current switch state to send out (1 = closed, 0 = open)

input portb.0 ' reed switch from the anemometer on RB0
output portd.1 ' indicator LED on RD1
tx var portc.6 ' serial transmit pin
n9600 con 16468 ' 9600 baud mode for serout2

main:
if portB.0 = 1 then ' if the switch is closed on pin RB0
low portd.1 ' set pin RD1 low
X = 1
else
high portd.1 ' set RD1 high
X = 0
endif

serout2 tx, n9600, [X] ' send the raw 0/1 reading out the serial port
goto main
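
Just to sketch the "huge text file of 0's and 1's" idea from above: assuming this Pic is on the first serial port at 9600 baud, a throwaway Processing sketch could plot each incoming bit as a dot, so stretches of wind and calm show up as visual texture on screen (or on that 6' x 6' print).

import processing.serial.*;

Serial windPort;
int x = 0;
int y = 0;

void setup() {
  size(600, 600);
  background(255);
  // assumes the anemometer Pic above is the first serial port
  windPort = new Serial(this, Serial.list()[0], 9600);
}

void draw() {
  // read every byte that has arrived since the last frame
  while (windPort.available() > 0) {
    int b = windPort.read();
    // 1 = reed switch closed (dark dot), 0 = open (pale dot)
    stroke(b == 1 ? 0 : 220);
    point(x, y);
    x++;
    if (x >= width) {   // wrap to the next row of the "texture"
      x = 0;
      y = (y + 1) % height;
    }
  }
}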
Changed line 232 from:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased under neath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging
to:
With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased under neath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging. What would I do with all those 1's and 0's? How ncould the speed of wind be translated and recorded over time in an interesting way that is not just another simple conversion program of rotions per second to speed/time? How can this information that the wind sensor gives us effect our sensory experience through space and sound? The wind sensor creates patterns of information over time in a bynary output activated by the reed switch...so how could this pattern or rythm generate some sensory experience of our environment besides just generating a huge text file? (Although a huge text file would be kind of interesting in 0's and 1's...you might be able to see variances in visual textures if the "drawing" was let's say printed on a 6' x6' piece of paper). But I chaose to see how the patterns and rythms of the wind sensor could be experienced through sound and 3D video projection through the use of MAX/MSP/JITTER.
Changed lines 198-199 from:
ADCON1 = %10000010\\
to:
ADCON1 = %10000010
Added lines 229-232:

!!!'''Datalogging with the wind speed sensor:'''

With this peak value detection code working well for the photocell, I then began to think of how I could hook up the wind sensor in order get some datalogging going. The wind sensor is activated by a small hermetically sealed reed switch encased under neath the rotor, thus the code had to be adjusted for digital output... but then there is a slight problem... there are no real peaks with the sensor because it just sends out pulses (1's and 0's)at different rates depending on the speed of the wind over any given period.This raised some interesting questions in terms of datalogging
Changed lines 176-199 from:
DEFINE ADC_BITS 10 ' Set number of bits in result
DEFINE ADC_CLOCK 3 ' Set clock source (3=rc)
DEFINE ADC_SAMPLEUS 20 ' Set sampling time in uS

PeakValue var word
SensorValue var word
LastSensorValue var word
Threshold var word
Noise var word

' serial pins and data reate:
tx var portc.6
rx var portc.7
n9600 con 16468

Threshold = 70 ' set your own value based on your sensors
PeakValue = 0 ' initialize peakValue
noise = 5 ' set a noise value based on your particular sensor

' Set PORTA to all input
TRISA = %11111111
' Set up ADCON1
ADCON1 = %10000010
to:
DEFINE ADC_BITS 10 ' Set number of bits in result\\
DEFINE ADC_CLOCK 3 ' Set clock source (3=rc)\\
DEFINE ADC_SAMPLEUS 20 ' Set sampling time in uS\\

PeakValue var word\\
SensorValue var word\\
LastSensorValue var word\\
Threshold var word\\
Noise var word\\

' serial pins and data rate:\\
tx var portc.6\\
rx var portc.7\\
n9600 con 16468\\

Threshold = 70 ' set your own value based on your sensors\\
PeakValue = 0 ' initialize peakValue\\
noise = 5 ' set a noise value based on your particular sensor\\

' Set PORTA to all input\\
TRISA = %11111111 \\
' Set up ADCON1\\
ADCON1 = %10000010\\
Added lines 168-170:

Added lines 172-228:

Based on code by Tom Igoe.

' Define ADCIN parameters
DEFINE ADC_BITS 10 ' Set number of bits in result
DEFINE ADC_CLOCK 3 ' Set clock source (3=rc)
DEFINE ADC_SAMPLEUS 20 ' Set sampling time in uS

PeakValue var word
SensorValue var word
LastSensorValue var word
Threshold var word
Noise var word

' serial pins and data rate:
tx var portc.6
rx var portc.7
n9600 con 16468

Threshold = 70 ' set your own value based on your sensors
PeakValue = 0 ' initialize peakValue
noise = 5 ' set a noise value based on your particular sensor

' Set PORTA to all input
TRISA = %11111111
' Set up ADCON1
ADCON1 = %10000010

Main:
' read sensor on pin RA0:
ADCin 0, sensorValue
'serout2 tx, n9600, ["sensor value", SensorVAlue]
' check if the reading is above the threshold:
If sensorValue >= threshold + noise then
' if it's greater than the last reading,
' then make it our current peak:

If sensorValue >= lastSensorValue + Noise then
PeakValue = sensorValue
' serout2 tx, n9600, ["peak reading1", peakValue]
endif
' if the sensorValue is not above the threshold,
' then the last peak value we got would be the actual peak:
Else
If peakValue >= threshold then
' this is the final peak value; take action
serout2 tx, n9600, [peakValue]
endif

' reset peakValue, since we've finished with this peak:
peakValue = 0
Endif

' store the current sensor value for the next loop:
lastSensorValue = sensorValue
pause 20
Goto main
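
For the datalogging side, the peaks this code sends out could simply be stamped and saved on the laptop. A minimal sketch of that, assuming the Pic is the first serial port and each peak arrives as a single byte; "peakLog.txt" is a made-up name:

import processing.serial.*;

Serial sensorPort;
PrintWriter peakLog;

void setup() {
  // assumes the peak-finding Pic above is the first serial port
  sensorPort = new Serial(this, Serial.list()[0], 9600);
  peakLog = createWriter("peakLog.txt");   // placeholder file name
}

void draw() {
  // nothing to animate; logging happens as peaks arrive
}

void serialEvent(Serial p) {
  int peak = p.read();   // one byte per detected peak
  String stamp = year() + "-" + month() + "-" + day() + " "
               + hour() + ":" + minute() + ":" + second();
  peakLog.println(stamp + "\t" + peak);
  peakLog.flush();       // flush so the file survives an abrupt stop
}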
Changed line 158 from:
Simple analog in...\\
to:
Simple analog in...with mesh\\
Changed line 161 from:
With peak detection...Mesh would only change if peak value was detected(with wire mesh and planes)\\
to:
With peak detection...Mesh would only change if a peak value was detected (with planes and wire mesh)\\
Added lines 166-168:


!!!'''Peak detection code for Pic:'''
Deleted lines 157-158:

http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wire.jpg
Changed lines 159-161 from:
to:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wire.jpg

With peak detection...Mesh would only change if peak value was detected(with wire mesh and planes)\\
Changed lines 163-164 from:
with peak detection...Mesh would only change if peak value was detected...\\
to:
Changed line 162 from:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1peak.jpg
to:
http://itp.nyu.edu/~ras4425/sensor/schaum1peak.jpg
Changed lines 160-166 from:
to:
Simple analog in...\\

http://itp.nyu.edu/~ras4425/sensor/schaumadcin1peak.jpg
with peak detection...Mesh would only change if peak value was detected...\\

http://itp.nyu.edu/~ras4425/sensor/schaumpeak2.jpg
http://itp.nyu.edu/~ras4425/sensor/schaumpeak3wire.jpg
Changed lines 159-160 from:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wirecopy.jpg
to:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wire.jpg
Changed lines 159-160 from:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wire copy.jpg
to:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wirecopy.jpg
Changed lines 159-160 from:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wirecopy.JPG
to:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wire copy.jpg
Changed lines 159-160 from:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wire copy.JPG
to:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wirecopy.JPG
Changed lines 159-160 from:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin-1wirecopy.JPG
to:
http://itp.nyu.edu/~ras4425/sensor/schaumadcin1wire copy.JPG
Added lines 158-160:

http://itp.nyu.edu/~ras4425/sensor/schaumadcin-1wirecopy.JPG
Changed lines 12-13 from:
!!!'''Processing code:'''
to:
!!!'''Processing code: 3d DataMesh and Planes.'''
Changed lines 128-129 from:
!!!'''Pic Code:'''
to:
!!!'''Pic Code: Simple analog in.'''
Changed lines 152-153 from:
goto main\\
to:
goto main
Changed lines 128-129 from:
!!!'''Pic Code'''
to:
!!!'''Pic Code:'''
Changed lines 154-157 from:
Once the Processing was running with srtaight Serial into it, th
to:
Once the Processing was running with Serial into it (the values in the Pic code were bracketed in order to make the readings in processing visually manageable), then I felt brave enough to attempt the peak finding with the photocell and wind sensor (which I would use furhter down the line in the datalogging assignment...)

Here are some visuals followed by the peak finding code I used with the Pic. The 3D graph is displaced sequentially, one vertex at a time, based on the readings from the sensor, creating a kind of landscape of the data after every cycle through the mesh.\\
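
The displacement itself is just a walk across the stored grid, one vertex per reading. Pulled out of the full sketch further down the page (and fed with random numbers here so it runs on its own), the core of it looks like:

int maxCols = 20;
int maxRows = 20;
int xcount = 0;
int ycount = 0;
int [][] yvalues = new int[maxCols][maxRows];

void setup() {
}

void draw() {
  // stand-in for a serial reading; the real sketch uses the sensor value
  storeReading(int(random(0, 255)));
}

// store one reading at the current vertex, then step to the next vertex,
// wrapping at the end of each row and starting over after the last row
void storeReading(int val) {
  yvalues[xcount][ycount] = val / 10;   // same scaling as the full sketch
  xcount++;
  if (xcount == maxCols) {
    xcount = 0;
    ycount++;
    if (ycount == maxRows) {
      ycount = 0;
    }
  }
}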
Changed lines 12-13 from:
to:
!!!'''Processing code:'''
Changed lines 131-153 from:
DEFINE ADC_BITS 10 ' Set number of bits in result
DEFINE ADC_CLOCK 3 ' Set clock source (3=rc)
DEFINE ADC_SAMPLEUS 50 ' Set sampling time in uS

TopVal con 300
ADCvar VAR WORD ' Create variable to store result
SendMeVar VAR byte

TRISA = %11111111 ' Set PORTA to all input
ADCON1 = %10000010 ' Set PORTA analog and right justify result

Pause 250 ' Wait .5 second

main:
ADCIN 0, ADCvar ' Read channel 0 to adval
if ADCvar > 300 then
ADCvar = TopVal
endif
SendmeVar = ADCvar*255/TopVal
serout2 PORTC.6, 16468, [SendMeVar] ' print it to serial out,
pause 20
goto main
to:
DEFINE ADC_BITS 10 ' Set number of bits in result\\
DEFINE ADC_CLOCK 3 ' Set clock source (3=rc)\\
DEFINE ADC_SAMPLEUS 50 ' Set sampling time in uS\\

TopVal con 300\\
ADCvar VAR WORD ' Create variable to store result\\
SendMeVar VAR byte\\

TRISA = %11111111 ' Set PORTA to all input\\
ADCON1 = %10000010 ' Set PORTA analog and right justify result\\

Pause 250 ' Wait .25 second\\

main: \\
ADCIN 0, ADCvar ' Read channel 0 to adval\\
if ADCvar > 300 then\\
ADCvar = TopVal\\
endif\\
SendmeVar = ADCvar*255/TopVal\\
serout2 PORTC.6, 16468, [SendMeVar] ' print it to serial out,\\
pause 20\\
goto main\\
Changed lines 128-154 from:
to:
!!!'''Pic Code'''

' Define ADCIN parameters\\
DEFINE ADC_BITS 10 ' Set number of bits in result
DEFINE ADC_CLOCK 3 ' Set clock source (3=rc)
DEFINE ADC_SAMPLEUS 50 ' Set sampling time in uS

TopVal con 300
ADCvar VAR WORD ' Create variable to store result
SendMeVar VAR byte

TRISA = %11111111 ' Set PORTA to all input
ADCON1 = %10000010 ' Set PORTA analog and right justify result

Pause 250 ' Wait .5 second

main:
ADCIN 0, ADCvar ' Read channel 0 to adval
if ADCvar > 300 then
ADCvar = TopVal
endif
SendmeVar = ADCvar*255/TopVal
serout2 PORTC.6, 16468, [SendMeVar] ' print it to serial out,
pause 20
goto main

Once the Processing was running with srtaight Serial into it, th
Changed lines 14-18 from:
int maxCols = 20;
int maxRows = 20;
int xcount = 0;
int ycount = 0;
int [][] yvalues = new int [20][20];
to:
int maxCols = 20;\\
int maxRows = 20;\\
int xcount = 0;\\
int ycount = 0;\\
int [][] yvalues = new int [20][20];\\
Changed lines 21-22 from:
import processing.serial.*;
Serial port;
to:
import processing.serial.*;\\
Serial port;\\
Changed lines 10-13 from:
Once the range of values were generated I then proceeded to integrate it with my Processing code as follows:

\\
to:
Once the range of values was generated, I then proceeded to integrate it with my Processing code as follows:
Changed lines 10-11 from:
Once the range of values were generated I then proceeded to integrate it with my Processing code as follows:\\
to:
Once the range of values were generated I then proceeded to integrate it with my Processing code as follows:

\\
Changed lines 9-10 from:
-> In the first assignment I used a simple photocell to generate distortions of a 20x20 3D mesh created in processing
to:
In the first assignment I used a simple photocell to generate distortions of a 20x20 3D mesh created in Processing. At first, all I wanted to accomplish was simply to get readings from the photocell in a range I could use. The initial setup was a breadboard with the photocell on porta.0, reading the values it generated in HyperTerm. The range varied greatly with different resistors in series with the photocell. I eventually decided on a 10K resistor... it gave the best readings (roughly 0-550 in normal indoor light).
Once the range of values were generated I then proceeded to integrate it with my Processing code as follows:\\

int maxCols = 20;
int maxRows = 20;
int xcount = 0;
int ycount = 0;
int [][] yvalues = new int [20][20];
int val;

import processing.serial.*;
Serial port;
//int serial = val;

void setup(){

println (Serial.list());
port = new Serial (this, Serial.list()[0], 9600);

size (600,600,P3D);
for ( int i = 0; i<20; i++){
for ( int j = 0; j<20; j++){
yvalues[i][j] = 0;

}
}
}

void draw(){


//smooth();
//framerate(5);
background(255);
strokeWeight(4);
yvalues[xcount][ycount] = val/10;
xcount++;

if (xcount == maxCols) {
ycount++;
xcount = 0;
if (ycount == maxRows) {
ycount = 0;
}
}

//FILL ALL THE RANDOM NUMBERS
//for ( int i = 0; i < 20; i++){
//for (int j = 0; j < 20; j++){
//yvalues[i][j] = int (random (-10,10));



//GO TO A STARTING POINT
translate(300,300,100); //rotation box
rotateX(radians(mouseX));
rotateZ(radians(mouseY));
stroke(150);
noFill();
box (250);

//GO AND DO "ROWS"
pushMatrix();
translate(-100,-100);
stroke(55,130,250);


for (int j = 0; j < maxRows; j++) {
pushMatrix();
translate(0,j*10);
rotateX(radians(90));
for (int i = 0; i < maxCols-1; i++){
int x = i*10;
line (x, -yvalues[j][i], x+10, -yvalues[j][i+1]);
//rectMode(CORNERS);
//fill (102,200,150,155);
//rect(x, -yvalues[j][i], x+10, -yvalues[j][i+1]);
}
popMatrix();
}
popMatrix();

//GO AND DO "COLS"
pushMatrix();
translate(-100,-100);
stroke(55,170,100);

for (int i = 0; i < maxCols; i++) {
pushMatrix();
translate(i*10,0);
rotateY(radians(90));
for (int j = 0; j < maxRows-1; j++){
int y = j*10;
//line (x, -yvalues[j][i], x+10, -yvalues[j][i+1]);
line (yvalues[j][i], y, yvalues[j+1][i],y+10);
// rectMode(CORNERS);
//fill (102,200,150,155);
//rect(yvalues[j][i], y, yvalues[j+1][i],y+10);
}
popMatrix();

}
popMatrix();

}

void serialEvent(Serial port){
if (port.available() > 0) {
val = port.read();
println(val);
//val = 0
}
}
void mousePressed(){
}

Changed lines 5-7 from:
This assignment was very interesting! It took a while at the beginning to get used to
the
new Processing91 environment there are some new formatting rules for integrating serial
information, but following the initial struggles success was made.
to:
-> This assignment was very interesting! \\
It took a while at the beginning to get used to the new Processing91 environment. There are some new formatting rules for integrating serial information, but after the initial struggles I got it working.
Changed lines 9-11 from:
In the first assignment I used a simple photocell to generate distortions of a 20x20 3D mesh
created in processing
to:
-> In the first assignment I used a simple photocell to generate distortions of a 20x20 3D mesh created in processing
Added lines 1-11:
!!'''Assignment 1: Serial into Processing with peak detection'''
\\


This assignment was very interesting! It took a while at the beginning to get used to
the new Processing91 environment there are some new formatting rules for integrating serial
information, but following the initial struggles success was made.
\\
In the first assignment I used a simple photocell to generate distortions of a 20x20 3D mesh
created in processing