CodeSatori
@DolphinBugFixing: Exactly, what counts most is the level of active usage. The performance difference between 100 members and 50,000 members outside of active usage is marginal: it's only data stored in the database and files stored in the filesystem. Large tables have little effect in themselves; what matters is how they are accessed.

1. How to reduce the number of queries per page/script access? By optimizing the code. And really, there's no shortcut for that: you need to crack the files open, see how and why the queries are there, think about how you can normalize or otherwise revise the database design and the queries to eliminate all unnecessary overhead, and introduce internal software caches as necessary to eliminate recurring queries on unchanged data (see the cache sketch after this list). Even with a sophisticated MySQL cache setup, you will still have a bunch of possibly redundant PHP required to process the overhead data, again counting against your CPU.

2. How to reduce the impact of the existing queries? By tuning your MySQL server settings to match your data situation, and by installing accelerators/optimizers on your MySQL server (see the configuration sketch below).
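To illustrate #1, here's a minimal sketch of an in-request software cache in PHP. The $oDb wrapper, its getAll() method, and the table/column names are hypothetical stand-ins, not Dolphin's actual API; the point is just to memoize result sets so an identical SELECT is never sent twice while building one page:

function cachedQuery($oDb, $sSql)
{
    // Memoize result sets for the lifetime of this request, so repeated
    // identical SELECTs on unchanged data hit the database only once.
    static $aCache = array();
    $sKey = md5($sSql);
    if (!isset($aCache[$sKey]))
        $aCache[$sKey] = $oDb->getAll($sSql);
    return $aCache[$sKey];
}

// Both calls below run the query once; the second returns the cached rows.
$aActive = cachedQuery($oDb, "SELECT `ID`, `NickName` FROM `Profiles` WHERE `Status` = 'Active'");
$aActive = cachedQuery($oDb, "SELECT `ID`, `NickName` FROM `Profiles` WHERE `Status` = 'Active'");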
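And for #2, a hypothetical my.cnf fragment of the kind such tuning usually touches on a MySQL 5.x server (the Dolphin 7.x era). The variable names are real MySQL settings, but the values are placeholders that have to be sized to your RAM and data:

[mysqld]
key_buffer_size         = 256M   # index cache for MyISAM tables
query_cache_type        = 1      # MySQL 5.x query cache (removed in MySQL 8.0)
query_cache_size        = 64M    # caches result sets of repeated identical SELECTs
tmp_table_size          = 64M    # keep more implicit temporary tables in memory
max_heap_table_size     = 64M    # raise together with tmp_table_size
innodb_buffer_pool_size = 512M   # data/index cache for InnoDB tables

Keep in mind that the query cache only papers over redundant queries: any write to a table invalidates all cached results for it, which is exactly why #1 (fewer queries in the first place) is the lasting fix.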

Much of the talk regarding performance issues has been beating around the bush with #2 above, while #1, which is the real and lasting solution to all of this, has only received attention in passing at best. Don't think #1 is anything you'd want to get into yourself even with adequate skills, unless you intend to only shave off the critical bottlenecks, or unless you have a year plus of hobby time at your disposal. Optimization really isn't something software end-users should be worrying about.
 
 