Let’s try an experiment: open Microsoft Word (whatever version you are using) alongside “API Monitor v2”. Select the Word process from the list of “Running Processes” and start analyzing it, enabling interception only for “Component Object Model (COM)” methods.
We immediately see how much of the COM ecosystem is exercised just by keeping Microsoft Word open, let alone by typing or, even more so, by embedding an object.
INTRODUCTION
The COM world is often thought to be obsolete and no longer used, but, as we will discover in this series, the truth is quite different. By understanding technologies such as DDE, COM objects, DCOM, ActiveX, OLE, and MSRPC, you will arrive at the end of this journey with a unified picture of all of them, from an offensive perspective (which I like best) and, why not, a defensive one as well. And if you would then like to hunt for 0-days in the Office suite, welcome: much of it is built on these very technologies.
If you would like to keep me company, as if we were chatting on an afternoon stroll, the introductory part of the course is more about “philosophy” than hard technical notions. If you enjoy these theoretical topics too, by all means read it; otherwise, feel free to skip ahead to the first technical chapter.
GENERALIZATION AND ABSTRACTION
HISTORY
Ever since Neolithic times, our ancestors have felt the need (and thus the desire) not to “exert too much effort”. If we think about it, all technological progress starts from the desire to economize energy, to create methods and tools that make us more productive with less effort.
To create any tool, be it a work tool or any other innovation, it is necessary to “schematize” how the world works, through generalizations and abstractions.
Let’s look at them in detail. I found in a Stack Overflow Q&A a nice explanation of these two words, followed by another great illustration:
While abstraction reduces complexity by hiding irrelevant detail, generalization reduces complexity by replacing multiple entities which perform similar functions with a single construct.
Here are the definitions according to an Italian encyclopedia, Sapere:
GENERALIZATION: (in philosophy) the act, or procedure, by which the properties of one or more elements of a class are extended to all the elements of that class. Generalization constitutes the raison d’être and end point of all deductive reasoning.
ABSTRACTION: a process by which the human mind constructs universally valid concepts through the analysis of particular elements, isolating them from their spatiotemporal context.
Although the definitions of these two terms vary and sometimes disagree, the key word here is “simplification”: both mechanisms let us focus on fewer elements than actually exist, depending on the goal to be achieved.
Suppose we are in prehistoric times and we have to grind the coarse kernels of some cereal. We have long been doing this with our bare hands, and we have realized that it causes us pain and wear.
At this point we ask ourselves: what can I do to grind these kernels quickly, and perhaps without breaking my hands?
GENERALIZATION: having had direct experience grinding kernels, we generalize their “average hardness”: if every kernel I have ground so far broke under a firm punch, then all the kernels I have collected, and will try to break, will break the same way.
ABSTRACTION: to find a tool, we reason that we need “something hard and tough” with which to beat the kernels and break them without breaking the tool itself.
At this point, through cognitive processes, insight arrives: a solution suited to the initial conditions.
We then used the pillars of our rational thinking, generalization and abstraction, to innovate and economize.
Beware, however: learn to recognize early the logical fallacies and biases that arise from the desire to reach a solution quickly and with less effort.
EVOLUTION
Over time these “unconscious” expedients have been brought to awareness and formalized.
Let us fast forward to modern times…
In software development, many principles have been devised in this same primitive vein of “schematizing and economizing how reality works”:
DRY (The Pragmatic Programmer, 1999)
KISS (U.S. Navy, 1960)
DESIGN PATTERNS (Design Patterns: Elements of Reusable Object-Oriented Software, 1994)
Not to mention the emergence of programming paradigms more advanced than procedural programming:
Functional Programming
But, in all this, where does Windows fit in? 🤔
WINDOWS: FROM PROBLEM TO SOLUTION
The solutions we will see in this series grew out of Microsoft’s need for a method of inter-process communication that was “standard”: one guaranteeing interoperability between programs written in different languages, modularity, and reuse.
In the beginning, IPC had to be achieved with named pipes, sockets, shared files, or shared memory, each implemented independently and at the discretion of the programmer on duty. The result was a variety of divergent implementations, written moreover in different languages, and code that was difficult to debug.
To perform IPC, a Word back-end programmer had to know the internal details of file, socket, and named-pipe management.
Thus, the difficulties were:
- Lack of a standard framework
- Primitive and inflexible implementations
- Interoperability problems
- Complicated errors and debugging
- Limitations of cooperative multitasking
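To make the “programmer on duty” problem concrete, here is a minimal Python sketch of the kind of ad-hoc IPC every team had to reinvent before a standard existed. Everything here is invented for illustration: the home-grown wire protocol (a 4-byte big-endian length prefix followed by the payload) and all the helper names are assumptions, not any real standard — and that arbitrariness is exactly the problem COM later addressed.

```python
# Ad-hoc IPC, as a pre-COM programmer might write it: a home-grown
# framing protocol over a local socket. A different team would pick a
# different framing, and the two programs could not talk to each other.
import socket
import struct
import threading

def recv_exact(sock, n):
    """Read exactly n bytes, looping because recv() may return less."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("peer closed the connection early")
        data += chunk
    return data

def send_msg(sock, payload):
    # Our invented framing: 4-byte big-endian length, then the bytes.
    sock.sendall(struct.pack(">I", len(payload)) + payload)

def recv_msg(sock):
    (length,) = struct.unpack(">I", recv_exact(sock, 4))
    return recv_exact(sock, length)

# "Server" process stand-in: accept one connection, read one message.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))      # ephemeral port, localhost only
listener.listen(1)
results = []

def server():
    conn, _ = listener.accept()
    with conn:
        results.append(recv_msg(conn))

t = threading.Thread(target=server)
t.start()

# "Client" process stand-in: connect and send one framed message.
with socket.create_connection(listener.getsockname()) as client:
    send_msg(client, b"grind the grain")

t.join()
listener.close()
print(results[0].decode())  # prints "grind the grain"
```

Note how every design decision — byte order, framing, error handling — lives in this one program’s head. Nothing stops the next program from choosing a different scheme, which is precisely the lack of a standard framework listed above.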
TO BE CONTINUED
I know I have bored you with philosophical arguments; however, starting from the next post I will get technical, covering the most common IPC mechanisms and, later, the first solution Microsoft Windows devised for this problem: Dynamic Data Exchange (DDE).