19-04-2008, 05:35 PM
I just never went to sleep two nights ago so after about 30 hours of being awake, I was too tired to do much and thus that rant was born. 
I've spent a lot of my time programming just optimizing stuff. While this has taught me a ton, it has also resulted in a lot of wasted time. On my recent project, I have been trying to change things a bit. I now write something out quickly, often with a very poor design, until it works. At that point, I do some basic optimizations while refactoring it. After testing it to make sure it works, I don't go back to it until profiling shows it runs too often or takes too long, or until I'm working with the code again and realize it needs to be redesigned.
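The "profile before optimizing" step above can be sketched with Python's built-in cProfile module (Python here only as a neutral illustration; the same idea applies with any .NET profiler, and hot_path is just a made-up stand-in for whatever function you suspect):

```python
import cProfile
import pstats

def hot_path():
    # Hypothetical candidate for optimization.
    return sum(i * i for i in range(10_000))

profiler = cProfile.Profile()
profiler.enable()
for _ in range(100):
    hot_path()
profiler.disable()

# Sort by cumulative time (time spent in a function plus its callees),
# which surfaces the call counts and durations mentioned above.
stats = pstats.Stats(profiler)
stats.sort_stats("cumulative")
# stats.print_stats(5)  # uncomment to show the five most expensive entries
```

Only when the call counts or cumulative times in that report are actually too high is it worth going back to optimize.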
I think it's good to know the relative performance of certain functions, along with alternative ways to do things, and especially the broader concepts. For example, if you are making a high-performance server in .NET, do you use BinaryReader/BinaryWriter and streams, or do you roll your own? What about cropping out unused bits? Who cares; as long as it functions properly, you can swap in a replacement later. But what type of networking scheme do you use? That decision affects a huge chunk of the networking code, so it's best to pick the right concept up front. If you start with threaded, blocking sockets, you will have to rewrite it all later as something like async sockets, which is a huge design change. But as long as you are using async sockets, the performance of the individual components doesn't have to be great at first (i.e. you don't have to worry about your send queues or receive buffers performing well). So I guess: use high-performing designs when you know you will need them, because changing a design later often requires a large rewrite, but don't optimize the details until you actually need to, because those optimizations are usually small, contained rewrites.
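The blocking-vs-async distinction above can be sketched in Python with asyncio (Python standing in for .NET purely for illustration; handle_client and the echo behavior are invented for the sketch). The point is structural: one event loop multiplexes all connections, so no per-connection thread exists to rewrite later, and slow pieces inside each handler can be optimized in place:

```python
import asyncio

async def handle_client(reader: asyncio.StreamReader,
                        writer: asyncio.StreamWriter) -> None:
    # Each connection is a lightweight coroutine, not a dedicated
    # blocking thread; a slow send queue here only delays this one
    # coroutine, and its internals can be swapped out later.
    data = await reader.read(1024)
    writer.write(data)  # echo back as a trivial stand-in for real protocol work
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main() -> None:
    server = await asyncio.start_server(handle_client, "127.0.0.1", 8888)
    async with server:
        await server.serve_forever()

# asyncio.run(main()) would start the server; omitted here so the
# sketch can be imported without blocking.
```

Contrast that with the threaded, blocking design: the same echo logic would sit inside a `socket.accept()` loop spawning a thread per client, and moving from that to the event-loop shape is exactly the whole-program rewrite the post warns about.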