I can't tell you about F1, but I can tell you about my own experience.
I spent 5 months setting up an experiment to develop an engine map. I had a 550 MHz PIII processor with 256 MB of RAM and 2x20 GB hard drives just to monitor the engine successfully under 5500 rpm. Above that, the PC was just too underpowered.
So why did I need so much power? The PC was connected to an Audi TT engine, and I had to design a system that collects data from the engine while it runs. I needed that much RAM because the engine speed is about 6000 rpm, which is 100 revolutions per second, and for every 30° the crankshaft turns I had to record 11 readings occupying 89 bytes. So every second the PC collects 89 x 360/30 x 100 = 106.8 kB/s (not too much). Then the PC has to manipulate the data in all sorts of feedback loops and send it back to the engine management system. The system stores everything, including the calculations and the data sent out. The net result is that the demand on the RAM just for transferring data is about 60 MB per second, and the storage space for all the data over 1 hr is up to 18 GB. The 256 MB of RAM was just to make sure the PC didn't slow down, because the system right now might crash if as little as 100° of crankshaft rotation went unmeasured. The reason is beyond the scope of this text :) Still, 256 MB proved to be insufficient. The project was supposed to work towards a system whereby the whole thing would be compacted into a couple of microchips, probably with a 600 MHz processor, and just get bolted onto an engine. This would enable the engine to create its own engine maps when required, so that performance would always be at its peak.
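The arithmetic above can be sanity-checked in a few lines. The figures are the ones from the text; the variable names are mine. Note the 60 MB/s and 18 GB/hour numbers come from the feedback loops and logging, so they can't be derived from the raw rate alone and are taken as given.

```python
# Back-of-envelope check of the acquisition rate described above.
BYTES_PER_SAMPLE = 89        # one 11-reading record per 30 deg of crank
SAMPLES_PER_REV = 360 // 30  # = 12 records per revolution
REVS_PER_SEC = 6000 // 60    # 6000 rpm = 100 rev/s

bytes_per_sec = BYTES_PER_SAMPLE * SAMPLES_PER_REV * REVS_PER_SEC
print(f"raw acquisition rate: {bytes_per_sec / 1000:.1f} kB/s")  # 106.8 kB/s
```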
In F1, obviously, two-way telemetry is not allowed. My system had only 11 sensors, but I would imagine an F1 car has more than tenfold that number, and at 18000 rpm that would add up to thirty times the amount of data per second. If I assume they use the same techniques, that's about 3 MB per second; over a 90-second lap that's 270 MB. My number is high because my data acquisition process was obviously very inefficient, otherwise F1 cars would need Pentium 8 processors. The guys who took over the experiment from me said they managed a trick with two PCs and three processors. They also added a couple more sensors.
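The F1 estimate is just my rig's numbers scaled up, so it's easy to reproduce; the 270 MB figure in the text comes from rounding to 3 MB/s before multiplying by 90 s.

```python
# Scaling my rig's measured rate to a guessed F1 setup (both factors
# are assumptions from the text, not measured F1 numbers).
base_rate_kBps = 106.8          # my rig: 11 sensors at 6000 rpm
sensor_factor = 10              # ~110+ sensors vs my 11
speed_factor = 18000 / 6000     # 18000 rpm vs 6000 rpm

f1_rate_kBps = base_rate_kBps * sensor_factor * speed_factor
lap_total_MB = f1_rate_kBps * 90 / 1000   # over a 90 s lap
print(f"{f1_rate_kBps / 1000:.1f} MB/s, {lap_total_MB:.0f} MB per lap")
```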
My 11 sensors were (let's see if I can remember :) ):
1.) Intake pressure
2.) Intake temperature
3.) Exhaust pressure
4.) Exhaust temperature
5.) Crank angle
6.) Cylinder vapor conductivity
7.) Intake air torque (vorticity)
8.) Intake flow rate
9.) Outlet flow rate
10.) Acoustic sensor
11.) Acoustic sensor
10 and 11 were acoustic sensors; I don't exactly remember where they were located because my project didn't involve them. I think they were used to measure engine vibration or something.
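For what it's worth, those 11 readings fit the 89-byte records mentioned earlier if you assume 11 double-precision values plus a one-byte status flag. That layout is purely my guess (the text never gives one); the reading values below are invented.

```python
import struct

# Hypothetical layout of one 89-byte record: 11 float64 readings plus
# a 1-byte status flag, packed little-endian with no padding.
RECORD_FMT = "<11dB"
assert struct.calcsize(RECORD_FMT) == 89

readings = [101.3, 45.2, 110.7, 650.0, 30.0, 0.12, 3.4,
            0.021, 0.019, 0.5, 0.6]       # made-up sensor values
packed = struct.pack(RECORD_FMT, *readings, 0)  # 0 = "status OK" (made up)
print(len(packed))  # 89
```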
I was introduced to the basics of engine dynamics and I was surprised to find how critical seemingly small things are in an engine. For example, the relationship between the vorticity and the inlet pressure/temperature directly affects how much air you can squeeze into the engine and, after that, how well that air burns. The relationship was shown on several 3D diagrams, which should really be 4D if the human mind could comprehend that.
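To give a feel for what one slice of such a map looks like in practice: an engine map is basically a grid of measured points that the management system interpolates between. Everything below — the axes, the numbers, the choice of bilinear interpolation — is invented for illustration; real maps have far more dimensions, hence the "4D" remark.

```python
# Illustrative 2D slice of an engine map: rpm x intake pressure -> some
# tuning value. All numbers are made up for the example.
rpm_axis = [2000, 4000, 6000]
pressure_axis = [0.8, 1.0, 1.2]   # intake pressure in bar (invented)
table = [                          # rows: rpm, columns: pressure
    [0.78, 0.85, 0.90],
    [0.82, 0.91, 0.96],
    [0.80, 0.88, 0.93],
]

def find_cell(axis, v):
    """Index of the left edge of the grid cell containing v (clamped)."""
    for k in range(len(axis) - 1):
        if v <= axis[k + 1]:
            return k
    return len(axis) - 2

def bilinear(x_axis, y_axis, grid, x, y):
    """Bilinear interpolation between the four surrounding map points."""
    i, j = find_cell(x_axis, x), find_cell(y_axis, y)
    tx = (x - x_axis[i]) / (x_axis[i + 1] - x_axis[i])
    ty = (y - y_axis[j]) / (y_axis[j + 1] - y_axis[j])
    a = grid[i][j] * (1 - tx) + grid[i + 1][j] * tx
    b = grid[i][j + 1] * (1 - tx) + grid[i + 1][j + 1] * tx
    return a * (1 - ty) + b * ty

print(bilinear(rpm_axis, pressure_axis, table, 3000, 0.9))  # ~0.84
```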
I would expect all teams to scramble their telemetry, although with 100+ variables it would still be hard to tell what's what even if you did get the data.