Why use a distributor?

Don,

My son asked Kenny D. about this subject this weekend at the nats because of this post. He pointed to the two cars he was working on and said both made the same power on the dyno. One had a distributor, one did not. Now, these cars were running race gas and not methanol.

Jim
 
From what we've covered, I have to conclude that the lower EGTs found after switching to a distributor are caused solely by the improved spark delivered by the MSD box, not the distributor itself. Any added spark advance can play a small role in lowering EGT too. A quicker, more complete burn due to a better spark would also increase hp, lower fuel requirements, and show up as a leaner O2 reading.

I don't think that is solely the correct conclusion. If this improved spark caused more of the A/F mixture to be burned in the cylinder, that would read as a decrease in EGT in the pipe where the probe is placed, BUT at the same time, if more fuel is burned in the cylinder it releases more heat energy, so that can also read as an increase in EGT. If a more complete burn is occurring in the chamber, then less ignition timing is required. This complicates the conclusions even more.

I don't think it's as cut and dried as it's made out to be.

Where and when the mixture is burned, how much of it is left over from incomplete combustion, and probe placement all factor into the EGT equation.

In a forced induction application, more heat is generated for a given engine size compared to a naturally aspirated application. Thus the extra fuel (richer A/F) is required to keep temps reasonable and not melt pistons, etc. Because of these richer A/F ratios, it's possible that EGTs are affected more. You will always have more fuel left over at an 11.5:1 O2 reading used in forced induction than at, say, 12.8:1 NA, no matter what the spark does. O2 is not the same as EGT.
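To put rough numbers on "more fuel left over," here's a minimal sketch (assuming gasoline with a stoichiometric ratio of roughly 14.7:1; the exact value depends on the fuel blend) that converts an A/F ratio to lambda and an approximate fuel excess:

```python
# Rough comparison of how much "extra" fuel is present at two A/F ratios.
# Assumes gasoline with a stoichiometric ratio of ~14.7:1 (varies with blend).

STOICH_AFR = 14.7

def fuel_excess(afr: float) -> float:
    """Fraction of fuel beyond stoichiometric for a given A/F ratio."""
    lam = afr / STOICH_AFR       # lambda: 1.0 = stoichiometric, < 1.0 = rich
    return (1.0 / lam) - 1.0     # 0.28 means ~28% more fuel than stoich

for afr in (11.5, 12.8):
    print(f"AFR {afr}: lambda = {afr / STOICH_AFR:.2f}, "
          f"~{fuel_excess(afr) * 100:.0f}% excess fuel")
```

Run as written, it works out to roughly 28% excess fuel at 11.5:1 versus about 15% at 12.8:1, which is why the richer forced-induction mixture leaves more unburned fuel in the exhaust regardless of what the spark does.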
 
Don,

My son asked Kenny D. about this subject this weekend at the nats because of this post. He pointed to the two cars he was working on and said both made the same power on the dyno. One had a distributor, one did not. Now, these cars were running race gas and not methanol.

Jim

If both systems are powerful enough to adequately start the combustion process, that is exactly what I would expect. A spark is a spark is a spark.

If, for whatever reason, ignition misses are occurring with one of the systems, you would, of course, expect lower hp.
 
I don't think that is solely the correct conclusion. If this improved spark caused more of the A/F mixture to be burned in the cylinder, that would read as a decrease in EGT in the pipe where the probe is placed, BUT at the same time, if more fuel is burned in the cylinder it releases more heat energy, so that can also read as an increase in EGT. If a more complete burn is occurring in the chamber, then less ignition timing is required. This complicates the conclusions even more.

I don't think it's as cut and dried as it's made out to be.

Where and when the mixture is burned, how much of it is left over from incomplete combustion, and probe placement all factor into the EGT equation.

In a forced induction application, more heat is generated for a given engine size compared to a naturally aspirated application. Thus the extra fuel (richer A/F) is required to keep temps reasonable and not melt pistons, etc. Because of these richer A/F ratios, it's possible that EGTs are affected more. You will always have more fuel left over at an 11.5:1 O2 reading used in forced induction than at, say, 12.8:1 NA, no matter what the spark does. O2 is not the same as EGT.


I agree. I was only trying to put down something that could be built upon.

Now, since I don't have a bunch of experience power tuning with gasoline, let me ask: when EGTs start to get out of line, what is usually done to get temps back under control? With alcohol, you'd first want to make sure timing is not retarded. With that being OK, the next thing would be fueling. Not taking any aftercooling changes into consideration, extra fuel would be added to act as a combustion coolant. This would lower initial charge temp, combustion temp, and ultimately exhaust temp. Is it any different with gasoline?
 
I don't think that is solely the correct conclusion. If this improved spark caused more of the A/F mixture to be burned in the cylinder, that would read as a decrease in EGT in the pipe where the probe is placed, BUT at the same time, if more fuel is burned in the cylinder it releases more heat energy, so that can also read as an increase in EGT. If a more complete burn is occurring in the chamber, then less ignition timing is required. This complicates the conclusions even more.

I don't think it's as cut and dried as it's made out to be.

Where and when the mixture is burned, how much of it is left over from incomplete combustion, and probe placement all factor into the EGT equation.

The completeness of the burn doesn't factor into the need to change ignition timing as much as how fast the burn is. A lot of things dictate the speed of the burn; the faster the burn, the less ignition timing is needed. Ignition timing is really about placing peak cylinder pressure from combustion at the right crankshaft position, so the most push from that pressure is transferred to the crank and the most work is extracted from the combustion cycle. I believe the target is peak cylinder pressure between 12 and 17 degrees ATDC.
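As a rough illustration of why a faster burn needs less advance, here's a minimal sketch. The spark-to-peak-pressure times and the 15 degree ATDC target are illustrative assumptions, not measured values:

```python
# Rough spark-advance estimate: crank degrees swept from spark to peak pressure,
# minus the degrees after TDC where peak pressure should land (~15 deg ATDC here).
# The burn times below are illustrative assumptions, not measured values.

TARGET_PEAK_ATDC = 15.0   # target crank position of peak pressure, degrees after TDC

def advance_needed(rpm: float, spark_to_peak_ms: float) -> float:
    """Approximate spark advance (degrees BTDC) for a given spark-to-peak time."""
    deg_per_ms = rpm * 360.0 / 60000.0          # crank degrees swept per millisecond
    burn_degrees = spark_to_peak_ms * deg_per_ms
    return burn_degrees - TARGET_PEAK_ATDC      # light the fire this far before TDC

for spark_to_peak_ms in (1.2, 0.9):             # slower vs faster burn (assumed)
    adv = advance_needed(5500, spark_to_peak_ms)
    print(f"{spark_to_peak_ms} ms spark-to-peak @ 5500 rpm -> ~{adv:.0f} deg BTDC")
```

With the numbers assumed above, the slower burn wants roughly 25 degrees of advance and the faster one roughly 15, which is the point: the quicker the burn, the later you can light it and still put peak pressure in the same place.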
 
Not really that much, IMO. Except that alcohol cools more than gas and may require less of an A/F change to provide similar cooling. Another thing that can be done to lower EGTs in a forced induction app is to reduce backpressure: larger downpipes, more efficient turbines and turbine wheels, etc. This, however, is getting off the original subject a bit.

The question should revert back to: is the waste spark causing a rise in EGTs?
 
Don,

My son asked Kenny D. about this subject this weekend at the nats because of this post. He pointed to the two cars he was working on and said both made the same power on the dyno. One had a distributor, one did not. Now, these cars were running race gas and not methanol.

Jim

Good information. Thanks to your son. And for that matter, to the rest of you for contributing.
 
Not really that much, IMO. Except that alcohol cools more than gas and may require less of an A/F change to provide similar cooling. Another thing that can be done to lower EGTs in a forced induction app is to reduce backpressure: larger downpipes, more efficient turbines and turbine wheels, etc. This, however, is getting off the original subject a bit.

The question should revert back to: is the waste spark causing a rise in EGTs?

Right. I think that is the last lingering question.
 
I've been going through my personal library trying to find something that would suggest that the waste spark presents a problem during the exhaust firing. Nothing is mentioned on the subject except:

A quote from the Bosch Automotive Handbook: "As there is an additional spark during the exhaust stroke, it is important to ensure that residual mixture or fresh mixture is not ignited."
This would suggest to me that igniting mixture on the waste spark is not considered normal operation.

Some other interesting facts I was able to dig up:

Electromotive holds several important patents regarding DIS operation and coil charging.

When GM and Ford started going with DIS systems, they turned to Electromotive's patented technology for their technique of charging multiple inductive coils. Now, these factory systems are licensed from Electromotive, though the factory doesn't use them to the capacity they're capable of delivering. That's quite all right with Electromotive, as it allows them to continue offering their high performance DIS systems.
In other words, the factory DIS systems are watered-down versions. If you want the real deal, it's Electromotive.

Electromotive's ignition systems are inductive designs. Since there is one coil for every 2 cylinders, the charge time can be 4 times that of a distributor-triggered ignition. That means full-output sparks are available to well beyond racing rpm.

One of the biggest benefits of an inductive system is spark duration, especially with the lengthy coil charging times available between firings.
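To make the charge-time claim concrete, here's a minimal sketch of the time available per coil between its firings, assuming a 6-cylinder, 4-stroke engine (two crank revolutions per cycle):

```python
# Time available to charge each coil between its own firings, assuming a
# 6-cylinder, 4-stroke engine (one 720-degree cycle per two crank revolutions).

def time_between_firings_ms(rpm: float, firings_per_cycle: int) -> float:
    """Milliseconds between firings of a single coil."""
    cycle_ms = 2 * 60000.0 / rpm        # one full 720-degree cycle, in ms
    return cycle_ms / firings_per_cycle

rpm = 6000
# Distributor: one coil serves all 6 cylinders -> 6 firings per cycle.
# Waste spark: one coil per 2 cylinders fires twice per cycle (power + exhaust).
print(f"Distributor coil: {time_between_firings_ms(rpm, 6):.1f} ms between firings")
print(f"Waste-spark coil: {time_between_firings_ms(rpm, 2):.1f} ms between firings")
```

For a six-cylinder this simple firing-interval count works out to roughly 3x the time per coil; the exact multiplier depends on cylinder count and on how much of that window the electronics can actually use as dwell.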

There is some interesting information on CDI (Capacitor-discharge ignition) in the Bosch Handbook. I'll pass a summary along after I've finished studying it.
 
You may have a point; a good quality DIS system is probably just as good as a distributor, but the distributor is the cheapest way to go and will provide more "reliable" results than the OEM coil packs and modules, which are failure prone.
 
I've been doing some quick research on different ignition systems. There are basically 2 types of coil charging strategies being used in high performance ignitions: inductive and capacitor discharge. The properties of the spark at the plug are different with each. The CD system generates a powerful voltage and current that's capable of jumping the spark plug gap even with shunting present in the secondary system, which would include spark plug fouling. The duration of the spark, though, is very short. Hence the need to provide multiple spark events in an effort to ensure that the A/F charge is ignited, and to try to match the spark duration of an inductive system. Also, with the CD system the multiple sparking drops to only one spark event at a relatively early rpm, leaving only one short spark occurrence in the high-load, high-rpm regions of the engine's operating range. The power of the spark with a CD system remains constant throughout the rpm operating range.
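As a rough way to compare the stored energy behind each type of spark, here's a minimal sketch using the textbook energy formulas; the capacitance, voltage, inductance, and current values are illustrative assumptions, not specs for any particular box:

```python
# Stored energy behind one spark event, using the standard energy formulas.
# Component values below are illustrative assumptions, not specs for any box.

def cd_energy_mj(capacitance_uf: float, charge_voltage: float) -> float:
    """Capacitor discharge: E = 1/2 * C * V^2, returned in millijoules."""
    return 0.5 * (capacitance_uf * 1e-6) * charge_voltage**2 * 1000.0

def inductive_energy_mj(inductance_mh: float, peak_current_a: float) -> float:
    """Inductive coil: E = 1/2 * L * I^2, returned in millijoules."""
    return 0.5 * (inductance_mh * 1e-3) * peak_current_a**2 * 1000.0

# CD box: ~1 uF capacitor charged to ~450 V (assumed).
print(f"CD spark energy:        ~{cd_energy_mj(1.0, 450.0):.0f} mJ, dumped in a very short burst")
# Inductive coil: ~6 mH primary charged to ~6 A of dwell current (assumed).
print(f"Inductive spark energy: ~{inductive_energy_mj(6.0, 6.0):.0f} mJ, released over a longer duration")
```

With those assumed values the stored energy comes out similar for both; the practical difference is the delivery, a very short, intense burst from the CD box versus a longer-duration discharge from the inductive coil, which is what the duration discussion above is getting at.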

A quote from the Bosch Automotive Handbook pertaining to capacitor-discharge ignition (CDI): "For many applications, the spark duration of 0.1...0.3 ms is too brief to ensure that the air-fuel mixture will ignite reliably. Thus CDI is only designed for specific types of engine, and today its use is restricted to a limited application range, as transistorized ignition systems now afford virtually the same performance. CDI is not suited for aftermarket installations."

There is some variety in configurations and effectiveness with the inductive ignition types. That's next.
 
You may have a point; a good quality DIS system is probably just as good as a distributor, but the distributor is the cheapest way to go and will provide more "reliable" results than the OEM coil packs and modules, which are failure prone.

I agree. The distributor is the cheapest route.
 
I'm surprised. No one has an issue with the Bosch quote on CDI systems? I was a little taken aback by it.
 
I came across some interesting info. A 16-volt system doesn't improve the output of a capacitive discharge ignition control. Most CD ignitions will clamp down their supply voltage to fit their needs and let the transformer step up the voltage. However, a CD ignition will suffer greatly if voltage drops below 11 volts or so.

Most inductive systems, on the other hand, will benefit greatly from the increased voltage supply of a 16V system. If it's true that the stock system clamps down the primary voltage supply, then this would not apply to the 'stock' system.

The extra capacity of a 16V system is particularly important on race cars that don't run an alternator.

Precautions should be observed in the pits when charging a 16V system.
 
It makes sense that it could help the factory setup, because it rapidly runs out of time to adequately charge the ignition coil(s) as rpm rises. A direct way to help address that problem is to raise the coil primary charge voltage from 12 V to 16 V. This will get more current and energy into the coil primaries in the same limited amount of time, since di/dt = V/L: raise V and di/dt rises as well.
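Here's a minimal sketch of that relationship, treating the coil primary as an ideal inductor and ignoring winding resistance; the 6 mH inductance and 1 ms dwell window are illustrative assumptions:

```python
# Current and energy built up in the coil primary over a fixed dwell window,
# treating it as an ideal inductor (di/dt = V/L, winding resistance ignored).
# The inductance and dwell time are illustrative assumptions.

L_PRIMARY_H = 6e-3    # coil primary inductance, henries (assumed)
DWELL_S = 1e-3        # dwell time available at high rpm, seconds (assumed)

for supply_v in (12.0, 16.0):
    di_dt = supply_v / L_PRIMARY_H                        # current ramp rate, A/s
    peak_i = di_dt * DWELL_S                              # current at end of dwell, A
    energy_mj = 0.5 * L_PRIMARY_H * peak_i**2 * 1000.0    # stored energy, mJ
    print(f"{supply_v:.0f} V supply: peak current {peak_i:.2f} A, "
          f"stored energy ~{energy_mj:.0f} mJ in {DWELL_S * 1000:.0f} ms of dwell")
```

With the same 1 ms of dwell, the 16 V supply ramps the current about a third faster, and since stored energy goes as the square of the current, the energy available at the plug improves even more.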

FAST sent us a custom-programmed ECU once for a turbo Nissan project. One thing that got messed up in the process was the EST dwell time, since this car was (unexpectedly to FAST) still using the stock inductive ignition rather than a CD ignition. Checking with a scope, the dwell pulse width wound up being only about 1 ms, and the car ran pretty badly in general, and not at all under load (no surprise). This is exactly analogous to the problem of running out of available dwell time with increasing rpm on the factory setup.

TurboTR
 