It seems that the general consensus here is that compressing the air on the compression stroke represents a net energy loss. This isn't entirely true. At the end of the compression stroke, if no fuel is added, the air compressed within the combustion chamber will expand again and drive the piston back down; think of the air as a spring. The pressure in the cylinder will then be much the same as at the end of the intake stroke. Very little net work is done, so the losses are minimal. They consist of the heat lost during compression plus some energy lost moving the air through the intake and exhaust valves.
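A minimal sketch of that 'air spring' argument, assuming an ideal gas and frictionless, reversible adiabatic strokes; the cylinder size and compression ratio are made-up illustrative numbers:

```python
# 'Air spring' sketch: ideal gas, reversible adiabatic strokes
# (P*V^gamma constant, gamma = 1.4 for air). All numbers illustrative.

GAMMA = 1.4

def adiabatic_work(p_start, v_start, v_end):
    """Work done BY the gas from v_start to v_end (J); negative = work done on it."""
    p_end = p_start * (v_start / v_end) ** GAMMA
    return (p_start * v_start - p_end * v_end) / (GAMMA - 1)

p_intake = 101_325.0   # Pa, roughly atmospheric at the end of the intake stroke
v_bdc = 5.0e-4         # m^3, a hypothetical 500 cc cylinder
v_tdc = v_bdc / 18.0   # 18:1 compression ratio

w_comp = adiabatic_work(p_intake, v_bdc, v_tdc)   # negative: work put into the charge
p_tdc = p_intake * 18.0 ** GAMMA                  # peak pressure at TDC
w_exp = adiabatic_work(p_tdc, v_tdc, v_bdc)       # positive: work the 'spring' gives back

print(round(w_comp + w_exp, 6))  # ~0: ideally, compression work is fully recovered
```

In this idealised model the net work over compression plus re-expansion cancels exactly; the real losses come from the heat transfer and flow effects discussed below.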
I respectfully disagree. Yes, it would seem that compressing the cylinder takes work, but the highly compressed air would then push the piston back down, releasing the same amount of work. That would be perpetual motion, which can't exist.
I believe the practical scale of the energy losses due to compression can be illustrated as follows: take an engine with good compression and crank it over by hand, or on the starter. Spinning it at starter-motor speeds takes on the order of 3 HP just to crank it, and that is nowhere near idling speed! Now adjust the exhaust valves so they are cracked open a little all the time (don't foul the pistons!) so the engine has no compression. The cylinders still pump the same volume of air, so pumping losses are unchanged, but there is no compression and, equally, no 'reclaimed' power on the downstroke after compression. The effect is quite remarkable: the starter spins the engine over twice as fast using half the current; it sounds like a turbine! This is why a hand-cranked engine has a decompression lever, to let you spin the engine up to cranking speed without the considerable power losses of the compression cycle (the same amount of air is still pumped with the decompression lever operated, as the lever allows air in and out of the cylinder freely).
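For what it's worth, 'twice as fast at half the current' works out to roughly a quarter of the starter energy per revolution. A rough check, assuming a hypothetical 12 V starter and made-up current and speed figures (and ignoring voltage sag and motor efficiency):

```python
# Rough numbers for the cranking experiment above. The 12 V supply and the
# current/speed figures are hypothetical; only the ratios matter here.

volts = 12.0
i_compressed, rpm_compressed = 150.0, 150.0        # with compression
i_decomp, rpm_decomp = i_compressed / 2, rpm_compressed * 2  # valves cracked open

p_compressed = volts * i_compressed    # electrical power drawn, W
p_decomp = volts * i_decomp

# Energy the starter spends per engine revolution (J/rev) = power / rev rate
e_compressed = p_compressed / (rpm_compressed / 60)
e_decomp = p_decomp / (rpm_decomp / 60)

print(e_compressed / e_decomp)   # -> 4.0: a quarter of the energy per rev
```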
The theory behind *why*, I would guess, is as follows: a fair bit of energy is lost as heat upon compression and escapes through the engine. Some of the compression escapes as blow-by, and the higher the compression, the higher the proportion. On the downstroke the hot air from compression has had time to cool somewhat, and by the time the piston gets to the maximum-torque halfway position the cylinder pressure may be half what it was on the upstroke at the same point. Increased compression also increases friction in the engine by placing greater loads on it. There are inertia losses too, from speeding the air up and pushing it into the cylinder head, where it 'piles up', requiring a change of inertia and direction on the downstroke, all at the expense of power. I'd guess there are also frictional losses in the air as it's forced into the pre-cup, which manifest as heat, a proportion of which is lost by conduction to the engine.
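A hedged sketch of that asymmetry: if some of the heat of compression leaks into the head and block around TDC, the expansion starts from a lower pressure and returns less work than the compression cost. The model below uses ideal-gas, reversible adiabatic strokes with an arbitrary 'fraction of TDC pressure lost to cooling' knob; all numbers are illustrative, not measured:

```python
# Asymmetric compression/expansion loop: heat lost to the metal at TDC means
# the downstroke recovers less work than the upstroke consumed.
# Ideal gas, reversible adiabatic strokes; all numbers illustrative.

GAMMA = 1.4   # ratio of specific heats for air

def adiabatic_work(p_start, v_start, v_end):
    """Work done BY the gas over a reversible adiabatic from v_start to v_end (J)."""
    p_end = p_start * (v_start / v_end) ** GAMMA
    return (p_start * v_start - p_end * v_end) / (GAMMA - 1)

p1, v_bdc, cr = 101_325.0, 5.0e-4, 18.0   # Pa, m^3 (500 cc), compression ratio
v_tdc = v_bdc / cr

w_in = -adiabatic_work(p1, v_bdc, v_tdc)  # work put INTO compressing the charge
p_tdc = p1 * cr ** GAMMA                  # peak pressure if nothing is lost

for cooling in (0.0, 0.1, 0.3):           # fraction of TDC pressure lost to the metal
    w_back = adiabatic_work(p_tdc * (1 - cooling), v_tdc, v_bdc)
    print(f"{cooling:.0%} pressure lost at TDC -> "
          f"{w_in - w_back:.1f} J unrecovered ({(w_in - w_back) / w_in:.0%} of input)")
```

With zero cooling the work cancels exactly, as in the air-spring picture; any pressure lost at TDC comes straight out of the recovered work.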
If one could have air going in and coming out at the same temperature, a block and head with zero heat transfer, and engine internals whose friction didn't increase with load, then the higher compression shouldn't cost anything. Of course, such an ideal engine doesn't exist.
As I see it, increased compression produces extra heat, some of which heats up the air in the pre-cup and some of which is lost into the engine, with a corresponding loss of efficiency. I think manufacturers run compression high because the efficiency loss may not be huge (I'm thinking perhaps 5% between as low as it will go and still run, compared to say 25:1) and it ensures reliable cold starting and running.
In trying to build an 'efficient' car, that possible 5% from a low CR, with maybe 5% tops from ceramic coating, 5% from aerodynamic mods, 3% from low-friction fluids and 5% from narrower low-resistance tyres, may take my 71 mpg and make it 87 mpg ;-) Add in a turbo and the increased efficiency it brings, or a low-resistance exhaust, ram-air inlet and porting to work on pumping losses, and perhaps some more? The goal is 100 mpg... Remove the brake vacuum pump and go electric and the parasitic load there is greatly reduced: spinning a brake vacuum pump at 1500 rpm (camshaft speed, 3,000 rpm engine) for two hours on the motorway when you aren't using your brakes is a big load!
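As a quick sanity check on how those percentages stack up (whether the gains simply add or compound depends on how independent they really are; this just shows both readings of the same figures):

```python
# Stacking the claimed efficiency gains from the post above.
# The individual percentages are the poster's estimates, not measurements.

baseline_mpg = 71.0
gains = [0.05, 0.05, 0.05, 0.03, 0.05]   # low CR, ceramic, aero, fluids, tyres

additive = baseline_mpg * (1 + sum(gains))   # treat the percentages as simply adding
compound = baseline_mpg
for g in gains:                               # treat each gain as independent
    compound *= 1 + g

print(f"additive: {additive:.1f} mpg")   # ~87.3, matching the 87 mpg in the post
print(f"compound: {compound:.1f} mpg")   # ~88.9 if the gains multiply
```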
As far as I know, the higher the compression ratio, the higher the efficiency.
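The textbook basis for that claim is the ideal (air-standard) Otto cycle, whose efficiency depends only on the compression ratio r: eta = 1 - r^(1 - gamma). It rises monotonically with r, but note that the model assumes away heat transfer, friction and blow-by, which are exactly the losses being debated in this thread. A quick sketch:

```python
# Ideal air-standard Otto-cycle efficiency: eta = 1 - r**(1 - gamma).
# This is the idealised result only; it ignores heat transfer, friction
# and blow-by, the losses the thread is arguing about.

GAMMA = 1.4   # ratio of specific heats for air

def otto_efficiency(r):
    return 1 - r ** (1 - GAMMA)

for r in (8, 12, 18, 25):
    print(f"CR {r:>2}:1 -> ideal efficiency {otto_efficiency(r):.1%}")
```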
This is what is often said; the purpose of the thread is to challenge it. I think I have made enough points so far; anyone have any more reasons *why* a high CR is supposed to be more efficient, given that a lower CR will perfectly heat, swirl and ignite the mix? I go back to my point about the TDIs: why don't VW put the CR up to 25:1 on them? I suspect because all they 'need' to ensure a good fuel burn is around 15:1, so they make it 18:1 or so to ensure good starting in the cold, knowing that any further increase just lowers efficiency and reduces hp and mpg, as the power to do the compression and produce that heat which is then lost comes from the power stroke, sapping the engine.