How IEM systems made ERP outdated


As for performance, the picture is simple. Let's face the facts:

  • 4,000+ concurrent active users in production (nobody has requested more, so far).
  • 18,000 concurrent active users in development testing.

    In September 2013, Ultima and FORS Distribution, the company representing the interests of Oracle Corporation in the Russian Federation (and, no less importantly, in Mongolia), conducted a joint test of the Ultimate e-Trade® solution built on the second generation of the Ultimate Solid platform.

    The testing was performed under a gradually increasing load, starting at 1,500 concurrent users.
    The maximum load reached in the final test run was 18,000 concurrent users.
    At that load, the test ran on 6 application servers and an Oracle Exadata Database Machine X2-2 with a dedicated RAC of two 24-core virtual servers.
    The on-disk storage consisted of 8 drives in RAID 10 and 2 SSD drives in RAID 0.
    The average application server load was 85%, and the average database server load was 70%.

    To check the actual user experience, several operators ran the test scenarios manually from the main client application and assessed the real delays. They noted that saving a client's order took no more than 1-2 seconds, generating a report took no more than 2 seconds, and displaying the goods table (with current balances, stock, prices and other analytical information) took no more than 1 second.
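The procedure described above, ramping up concurrent users while measuring per-operation latency, can be sketched as a simple load generator. This is purely illustrative, not Ultima's actual test harness; the `save_order` stub and the scaled-down user counts are assumptions standing in for real client operations against an application server.

```python
import random
import time
from concurrent.futures import ThreadPoolExecutor

def save_order():
    """Stub for a real client operation (e.g. saving an order).

    In a real test this would be a call to the application server;
    here we just sleep for a random, plausible service time.
    """
    time.sleep(random.uniform(0.01, 0.05))

def run_load(concurrent_users, ops_per_user):
    """Run ops_per_user operations on each of concurrent_users workers
    and return the observed per-operation latencies in seconds."""
    latencies = []

    def worker():
        for _ in range(ops_per_user):
            start = time.perf_counter()
            save_order()
            latencies.append(time.perf_counter() - start)

    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        for _ in range(concurrent_users):
            pool.submit(worker)
    return latencies

# Ramp the load up in steps, as in the test described above
# (tiny stand-ins for the real 1,500..18,000 user range).
for users in (15, 60, 180):
    lats = sorted(run_load(users, ops_per_user=5))
    p95 = lats[int(len(lats) * 0.95)]
    print(f"{users:4d} users: p95 latency {p95 * 1000:.1f} ms")
```

The operators' manual spot checks complement exactly this kind of percentile report: aggregate latencies tell you the system holds up, while a human at the real client confirms the experience stays comfortable.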


    • The users' experience remained quite comfortable even upon reaching the 18,000-user threshold;
    • The figure of 18,000 users is a technical ceiling imposed by the constraints of this particular test, not the performance limit of the tested Ultima solution; with greater server capacity it would go higher.

    We would like to note that, in contrast to the usual market practice of testing "spherical horses in a vacuum" (with the vacuum settings carefully selected to exploit the strengths and mask the weaknesses), this testing emulated the workload of real users: their activity profiles were taken from the real database of one of our customers, the largest online retailer in the Russian Federation.

    All the basic actions of real users were emulated during the test: creating orders (with regard to available stock), handling and preparing them at a warehouse, shipping, restocking, moving goods between warehouses to maintain the assortment, and other routine business activity.
    The distribution of goods between warehouses and across product categories was kept as close to reality as possible.

    Purchase orders to suppliers (for warehouse replenishment) were created with regard to the company's current turnover.
    Likewise, orders to replenish the stock of stores/shipment points/sales offices were formed based on actual sales from the warehouse and its limits (overall weight and size, value, etc.).
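Emulating real user activity, as described above, typically means having each virtual user draw its next action from an empirically observed frequency profile. A minimal sketch follows; the action names and weights here are invented for illustration, not taken from the actual customer data.

```python
import random

# Hypothetical activity profile: relative frequencies of operations,
# as might be mined from a real customer's database.
ACTIVITY_PROFILE = {
    "create_order": 45,              # create a sales order, checking stock
    "pick_and_pack": 20,             # handle the order at the warehouse
    "ship": 15,
    "restock": 10,                   # purchase orders to suppliers
    "move_between_warehouses": 5,    # maintain the assortment
    "run_report": 5,
}

def next_action(rng=random):
    """Pick a virtual user's next action, weighted by the profile."""
    actions = list(ACTIVITY_PROFILE)
    weights = list(ACTIVITY_PROFILE.values())
    return rng.choices(actions, weights=weights, k=1)[0]

# Each virtual user loops, drawing actions according to the profile:
sample = [next_action() for _ in range(1000)]
print(sample[:5])
```

Sampling from real frequencies is what separates this kind of test from a synthetic benchmark: the mix of cheap and expensive operations, and the contention they create, ends up matching production.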

    FORS Distribution press release based on the testing results

    Click here for details.

Comparison with the products created by our colleagues

                         Ultimate Solid,          Microsoft Dynamics®     SAP R/3
                         e-Trade configuration    AX 2012

Concurrent users         18,000                   5,000                   —
Purchase order creation  150 positions            5 positions             —
                         in 1.28 s                in 12.29 s
Sales order creation     5-8 positions            5 positions             —
                         in 0.9 s                 in 9.15 s
Application servers      6 servers:               10 servers:             —
                         2-core 2.2 GHz CPU,      12 cores each,
                         2 GB RAM each            16 GB RAM
Database server          Oracle Exadata           12-core 2.2 GHz         —
                         Database Machine X2-2:   AMD Opteron CPUs
                         48-core CPU, 32 GB RAM   (48 cores total),
                                                  256 GB RAM

Extended description of the Ultimate Solid testing (PDF, 400 KB)

Extended description of the Microsoft Dynamics AX 2012® testing (PDF, 400 KB)

Extended description of the SAP R/3 testing... wait a minute... what the hell... where is the official data on SAP performance?.. Or unofficial... Any kind of data on the (real) performance of the famous "market leader"?

Dearest readers!
Herewith we, to our deepest shame, have to admit: we have NO data on SAP performance.
The thrice-honored "market leader", notable for its gigabytes of PR, modestly keeps silent about THIS aspect.
Hey! Did you have the same thought? Oh yeah.

It's official: having spent a great deal of our time and money trying to unearth the relevant information, we found nothing. None of the "sapper-pioneers", no matter what confidentiality guarantees we offered, could say anything comprehensible.

That is why:

Whoever presents us with credible information about the real SAP performance, dead or alive, comparable in substance to the data above, will immediately be issued a certain quantity of green cookies produced by the U.S. Department of State; the number of bags of cookies is to be discussed and agreed upon.

Those who long for the cookies, please write to

Your privacy is guaranteed.

Update 01.09.2014

And here are the first little birds eager to try the cookies. A little swallow brought us a letter on its tail. A well-wisher, whose privacy is guaranteed, sent us a couple of links, namely:

SAP SD Standard Application Benchmark Results, Three-Tier Internet Configuration

Testing clarification (English)

We have to admit, to our shame, that despite all our efforts (extensively spent searching for information about SAP performance), we had not found this entry on the SAP website ourselves.

The shameful blow was somewhat softened by the two following facts:
— all our acquaintances who deal with SAP to some extent also found the very existence of this important information to be quite a revelation (sic!);
— we suggest that every reader independently evaluate the user-friendliness of the SAP corporate website when it comes to finding any practically useful information. And we gather that this obscurity is no mere coincidence!

In any case, by the Holy Cross, the financial interest of greedy birdies works true miracles ("Oh, nightingale, the nicest Russian bird!").

But more to the point.
What do we see, having clicked on the above links, kindly buried in the nightmarish depths of the gloomy PR Himalayas erected by the darkly brilliant and obscure website developers of the Teutonic company?
If you subtract the random noise, diligently crammed in there to complicate perception and thereby heighten the pseudoscientific effect, then...

...the official Ultima review of the SAP "testing" (and, simultaneously, a decisive kick in the pants for the first greedy birdie) is as follows:

the above-mentioned link actually contains the results of testing the performance of the database server's disk subsystem, not of the ERP system under conditions close to reality.
No modeling of enterprise operation (even with all the reasonable assumptions of the "spherical horse in a vacuum") happens during this "testing". On the contrary, we see the results of running an absolutely synthetic procedure that excludes the occurrence of any database locks, which are, in fact, the only real obstacle to the infinite scalability of a system.
Meanwhile, it is precisely the ability to avoid this blocking that determines the true performance of an ERP system in real-world conditions.
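The point about blocking can be demonstrated with a toy experiment: a synthetic workload where every transaction touches disjoint rows scales almost linearly with workers, while a realistic workload where transactions contend for the same rows (stock balances of popular goods, say) serializes on the lock. This is a sketch under simplifying assumptions, using an in-process lock to stand in for a database row lock.

```python
import threading
import time

def run(workers, contended):
    """Each worker performs 50 'transactions' and the function returns
    throughput in transactions per second. If contended, all workers
    must hold one shared lock (a single hot row); otherwise each worker
    uses its own lock (disjoint rows, as in a synthetic benchmark)."""
    hot_row = threading.Lock()

    def txn(lock):
        with lock:              # 'row lock' held for the whole transaction
            time.sleep(0.001)   # simulated work inside the transaction

    def worker():
        lock = hot_row if contended else threading.Lock()
        for _ in range(50):
            txn(lock)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    start = time.perf_counter()
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return workers * 50 / (time.perf_counter() - start)

for n in (1, 4, 8):
    print(f"{n} workers: disjoint rows {run(n, False):7.0f} txn/s, "
          f"hot row {run(n, True):7.0f} txn/s")
```

The disjoint-row case keeps improving as workers are added, while the hot-row case stays flat: total throughput is capped by the serialized critical section, no matter how much hardware is thrown at it. That is exactly why a benchmark engineered to avoid locks says nothing about ERP performance under a real, contended workload.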

"A twist of the wrist and no cheating"?
It deserves no more than a "C" grade. All in all, it is a slick job aimed at a rather incompetent audience.
That is why it was hidden so deep.

To cut a long story short: we keep accepting hints from lovers of green cookies.