General optimization is undecidable, and many specific optimization tasks happen to take provably exponential time or are NP-hard. It can even be hard to tell whether transformed code is actually smaller or faster. So we normally speak of code improvement. The general rule is to do those things that give the most improvement for the least amount of work; for instance, you might see compilers that don't bother doing any improvement outside loops.

Still, there are different degrees of optimization aggressiveness:

- Only a few basic simplifications on AST nodes or tuples (peephole optimization).
- Attempt near-optimal code within basic blocks (by removing redundancies like common subexpressions, doing smart instruction scheduling and register allocation, etc.).
- Intraprocedural optimization (improving across basic blocks, but still within a function, e.g., loop optimization using data flow analysis).

Where Can Optimization Be Done?

There is no single optimization phase in a compiler. Improvement can happen at many levels:

- The programmer needs to find a better algorithm; not many (if any) compilers can turn bubblesort into quicksort.
- The intermediate code generator can be smart about how it generates intermediate code.
- The compiler can make a pass or two or three over the intermediate code to improve it.
- The code generator can be smart, with clever instruction selection, by employing register tracking, etc.
- The compiler can make a pass or two or three over the target code to improve it.
- The run time system can adapt to the way a program is running, possibly recompiling and relinking on the fly.
- The programmer profiles the running code, finds out where the memory leaks, bottlenecks, and excessive resource consumption are occurring, and rewrites the code accordingly.

Typical aims of code improvement:

- Reuse results rather than recompute them.
- Maximize the use of registers, caches, and memory.
- Reduce the amount of code (e.g., remove dead and useless code).
- Reduce conditional jumps (they make branch prediction difficult).
- Optimize the use of pipelines and cores.
- Careful instruction selection, using an understanding of caches, pipelines, branch prediction algorithms, scheduling rules, and alignment requirements.

But note the need for tradeoffs! You can get rid of some jumps by unrolling loops, but this makes more code.

A pretty good reference:

- Wikipedia's Compiler Optimizations Category Page
- Agner Fog's Software Optimization Resources

Examples

Constant Folding

Why wait until the target program is run to do simple computations? Let the compiler do it. Remember we can always write trees as strings, so we'll do that from now on:

(NEG (- 6 (* 2 8)))

You can also replace expensive operations with cheaper ones. Exercise: Think up at least a dozen more of these.
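A constant folder over this string form can be sketched as follows. This is a minimal illustration, not code from the notes; the tiny s-expression parser and the operator set are my own assumptions.

```python
# Minimal constant-folding sketch over s-expression strings such as
# "(NEG (- 6 (* 2 8)))". Parser and operator table are illustrative choices.

def tokenize(s):
    return s.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    tok = tokens.pop(0)
    if tok == "(":
        node = []
        while tokens[0] != ")":
            node.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return node
    return int(tok) if tok.lstrip("-").isdigit() else tok

def fold(node):
    """Fold every subtree whose operands are all integer constants."""
    if not isinstance(node, list):
        return node
    op, *args = node
    args = [fold(a) for a in args]
    if all(isinstance(a, int) for a in args):
        if op == "+": return args[0] + args[1]
        if op == "-": return args[0] - args[1]
        if op == "*": return args[0] * args[1]
        if op == "NEG": return -args[0]
    return [op, *args]  # a variable somewhere below: keep the node

print(fold(parse(tokenize("(NEG (- 6 (* 2 8)))"))))  # the whole tree folds to 10
```

Note that folding works bottom-up: (* 2 8) becomes 16, (- 6 16) becomes -10, and NEG flips it to 10, all before any target code is generated.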
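The "replace expensive operations with cheaper ones" idea is usually called strength reduction. A sketch under my own assumptions (the (op, args...) list format and this small rule set are illustrative, not from the notes):

```python
# Strength-reduction sketch: rewrite expensive operations as cheaper
# equivalents on [op, args...] lists. The rule set is a small sample.

def reduce_strength(node):
    if not isinstance(node, list):
        return node
    op, *args = node
    args = [reduce_strength(a) for a in args]
    # x * 2  ->  x << 1   (a shift is cheaper than a multiply on many machines)
    if op == "*" and args[1] == 2:
        return ["<<", args[0], 1]
    # x ** 2  ->  x * x   (a multiply is cheaper than a power routine)
    if op == "**" and args[1] == 2:
        return ["*", args[0], args[0]]
    # x / 8  ->  x >> 3   (valid for nonnegative integer x, power-of-two divisor)
    if op == "/" and isinstance(args[1], int) and args[1] > 0 \
            and args[1] & (args[1] - 1) == 0:
        return [">>", args[0], args[1].bit_length() - 1]
    return [op, *args]

print(reduce_strength(["*", "x", 2]))   # ['<<', 'x', 1]
print(reduce_strength(["/", "x", 8]))   # ['>>', 'x', 3]
```

Each rule is only sound under conditions the comments state (e.g., the division rewrite assumes nonnegative integers), which is exactly why real compilers guard such rewrites with type and range information.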
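The least aggressive level mentioned earlier, peephole optimization, can also be sketched: scan a short window of generated instructions and apply purely local rewrites. The three-field tuple format and the rules below are illustrative assumptions, not a real instruction set.

```python
# Peephole-optimization sketch over (op, operand1, operand2) tuples.
# The instruction format and rewrite rules are illustrative.

def peephole(code):
    out = []
    for instr in code:
        op = instr[0]
        # adding/subtracting 0 or multiplying/dividing by 1 does nothing: drop it
        if op in ("addi", "subi") and instr[2] == 0:
            continue
        if op in ("muli", "divi") and instr[2] == 1:
            continue
        # a load immediately after a store of the same register/address is redundant
        if op == "load" and out and out[-1][0] == "store" and out[-1][1:] == instr[1:]:
            continue
        out.append(instr)
    return out

code = [("store", "r1", "x"), ("load", "r1", "x"),
        ("addi", "r2", 0), ("muli", "r3", 4)]
print(peephole(code))  # [('store', 'r1', 'x'), ('muli', 'r3', 4)]
```

Because each rule only inspects a tiny window, this pass is cheap; that is what makes it the "least aggressive" end of the spectrum.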