The final benchmark that we investigated in our research was the OCUS R20 benchmark.  Designed by Olaf Corten, the OCUS benchmark consists of 17 tests which are broken into 4 sub-categories: CPU, Graphics, GUI and Disk. 

Operations are performed on a single part similar in complexity to the BENCH99 part, and the tests themselves resemble BENCH99's.  The main difference between these benchmarks is that the graphics operations OCUS R20 performs are more in line with what the average user would perform on a daily basis.  This makes OCUS an excellent benchmark: it depends on both a strong graphics subsystem and a strong CPU, without biasing results toward strengths in either category.  With the graphics card held constant, OCUS makes for an excellent benchmark of CPUs; with the CPU held constant, it is a wonderful real world test of graphics cards under Pro/E.
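To make the structure of a suite like this concrete, here is a minimal sketch of aggregating timed tests into sub-category and overall totals.  The test names and timings below are invented, and we are assuming that each OCUS test reports elapsed seconds (lower is better); this is not the official OCUS scoring method.

```python
# Illustrative sketch: aggregate per-test timings into the four OCUS
# sub-categories and an overall total. All names and values are
# hypothetical, not actual OCUS R20 tests or scores.
from collections import defaultdict

# (sub-category, test name, elapsed seconds) -- hypothetical values
results = [
    ("CPU", "regenerate", 41.2),
    ("Graphics", "shaded_spin", 28.7),
    ("GUI", "menu_walk", 9.4),
    ("Disk", "retrieve_part", 12.1),
    # ...the remaining tests would follow the same pattern
]

totals = defaultdict(float)
for category, _test, seconds in results:
    totals[category] += seconds

for category, seconds in totals.items():
    print(f"{category:8s} {seconds:7.1f} s")
print(f"{'Overall':8s} {sum(totals.values()):7.1f} s")
```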

Choosing a new workstation configuration based on the results of this benchmark should automatically point you toward a system with good performance, and anyone with a Pro/E license can run it.  The test itself requires approximately 150MB of RAM and runs perfectly on a system with 384MB – 512MB of RAM, which is far from extreme in the Pro/E workstation world, where systems commonly approach and surpass 1GB memory configurations.

What sets the OCUS benchmark apart from the others is that those who run it are encouraged to send in their results so that they can be posted on the OCUS web page.  For this reason, the OCUS results are also the most current among the benchmarks: new results are usually posted on a weekly basis, whereas BENCH99 is updated yearly and SPECapc several times a year.  The most refreshing thing you will see is test results for the systems you all build; not just the pre-configured Alpha and Xeon workstations, but home-built AMD K6 and Athlon systems as well as tweaked-out Pentium III systems.

The other benchmarks post only results submitted by manufacturers whose systems are supported by PTC; they are mainly showcases for the new product offerings of the major workstation manufacturers.  How often do you have the opportunity to compare your home-built system to something manufactured by the NTSIs and SGIs of the industry?  The web page of Olaf Corten, the creator of the OCUS benchmark, provides you with this very opportunity, making OCUS R20 truly “the [Pro/E] benchmark you can do yourself!” as quoted from Corten’s site.

However, the OCUS does have its drawbacks, the most glaring of which is the lack of any common testing configurations and procedures.  A visit to the OCUS benchmark results page illustrates one major flaw in the way results are displayed: little more than the CPU and memory size is ever disclosed.

Any number of things can affect the results of a benchmark of this type, among them differing software builds, screen resolutions, operating systems and video cards.  The result is a lot of information that is not in a particularly useful format.  Even with these drawbacks, the OCUS R20 benchmark has become a very valuable decision-making tool for both the Pro/E user and the system administrator.
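Fixing that reporting flaw is mostly a matter of discipline: record the full configuration alongside the score.  The sketch below shows one way to do so, assuming a hypothetical results file; the field names and example values are our own, not part of any OCUS submission format.

```python
# Minimal sketch: save an OCUS result together with the configuration
# details the results page rarely discloses. Field names and the
# "ocus_result.json" file are hypothetical conventions of this sketch.
import json
import platform

result = {
    # Captured automatically from the machine running the benchmark.
    "os": f"{platform.system()} {platform.release()}",
    "cpu": platform.processor(),
    # Filled in by hand -- exactly the details usually left out.
    "memory_mb": 512,                          # hypothetical value
    "video_card": "Diamond Fire GL1",          # hypothetical value
    "resolution": "1280x1024",                 # hypothetical value
    "proe_build": "2000i",                     # hypothetical value
    "ocus_total_seconds": 412.3,               # hypothetical score
}

with open("ocus_result.json", "w") as f:
    json.dump(result, f, indent=2)
```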

Since the benchmark is open for public use, the OCUS R20 benchmark gave us the opportunity to rectify some of the downsides of its score reporting.  We immediately went to work, setting up a few mid-range NT workstations and outfitting them with everything from Intel’s Celeron and Pentium III up to AMD’s Athlon and Kryotech’s SuperG system with an Athlon running at 1000MHz.

We ran all of our tests at the exact same settings and under the same configuration for each setup, and in doing so set the AnandTech standard for how we run the OCUS R20 benchmark for Pro/E.  The results are easily comparable to those of other Pro/E users simply by following the configuration and settings referenced in our table documenting The Test.

If any Pro/E users would like to compare their systems to the ones we benchmarked in this comparison, be sure to use the same amount of memory, run at the same resolution, and use the same video cards that we used in the tests.  If you take care to keep these variables as close as possible to ours, you should have no problem making a direct comparison between your benchmarks and what we’re running in house.
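That comparison discipline can also be expressed in code.  Here is a sketch that refuses to compare two results unless the variables held constant actually match; it reuses the hypothetical field names from the earlier ocus_result.json sketch and is not an official OCUS tool.

```python
# Sketch: only compare two OCUS results when memory, resolution and
# video card match. Field names follow the hypothetical result format
# used in the earlier sketch.
MUST_MATCH = ("memory_mb", "video_card", "resolution")

def compare(ours: dict, yours: dict) -> None:
    mismatched = [k for k in MUST_MATCH if ours.get(k) != yours.get(k)]
    if mismatched:
        raise ValueError(f"Configs differ on {mismatched}; scores not comparable")
    # Lower elapsed time is better, so a positive delta means "slower".
    delta = yours["ocus_total_seconds"] / ours["ocus_total_seconds"] - 1.0
    print(f"Your system is {abs(delta):.1%} {'slower' if delta > 0 else 'faster'}")
```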

By documenting all of our test settings, we hope to further promote the use of the OCUS R20 benchmark as a standard method of comparing Pro/E performance, as well as promote standard configurations under which to run the tests, making it much easier to compare scores across multiple systems and platforms.

Before we get to the test description and the performance benchmarks themselves, let’s take a look at the specifics of the OCUS R20 benchmark.
