Developers Club geek daily blog

Unpacking data compressed by the Deflate algorithm with fixed Huffman codes, using the PNG format as an example

2 years, 9 months ago

As part of a recent lab assignment, my colleagues and I faced the task of analyzing a hexadecimal dump of a PNG file. According to the RFC 2083 standard, the PNG format stores pixel data compressed with the Deflate algorithm, so to analyze the dump we had to decompress the data with the Inflate algorithm.
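As an illustration (a minimal Python sketch, not the lab's actual tooling; the scanline bytes are invented), the zlib stream carried by PNG IDAT chunks can be inflated like this, and for short inputs deflate often chooses exactly the fixed-Huffman block type discussed here:

```python
import zlib

# Each PNG IDAT chunk carries a piece of a single zlib stream
# (an RFC 1950 wrapper around RFC 1951 deflate data). Simulate one:
# a filter byte followed by made-up pixel bytes, compressed.
scanline = bytes([0]) + bytes(range(12))       # hypothetical scanline
idat_payload = zlib.compress(scanline)

# decompressobj lets you feed IDAT chunks one by one, since a PNG
# may split the deflate stream across several IDAT chunks.
inflator = zlib.decompressobj()
recovered = inflator.decompress(idat_payload) + inflator.flush()
assert recovered == scanline
print(recovered[1:].hex())                     # pixel bytes, filter byte dropped
```

In a real dump you would concatenate the payloads of all IDAT chunks (or feed them to `decompress` one after another) before interpreting the scanlines.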

Read more »

A tale about a compressor that can be invoked, though I don't remember how

2 years, 9 months ago
A not-quite-New-Year's story is offered for your attention, complete with an opening, an intrigue, a detective investigation, a chase, treachery, ancient wisdom and a happy ending. Below the cut, archaeological excavations of perestroika-era Habr await you, with a pinch of x86 assembler to taste.

Read more »

EMC ViPR 2.1: data management for the "third platform"

2 years, 11 months ago

ViPR is an element of the software-defined data center

ViPR does for the storage segment roughly what VMware did for the server segment: it makes it possible to abstract resources, form pools and automate the infrastructure. Through VMware API interfaces, storage pools created in EMC ViPR are presented to VMware vSphere as a simple array. In addition, the ViPR controller integrates with VMware vStorage API for Storage Awareness (VASA) and vCOps, as well as with the management and orchestration tools of the VMware SDDC, vCloud Automation Center and vCenter Operations Manager. Thus, ViPR can manage storage as an independent object, represented as such both in Microsoft and OpenStack virtual environments and within the VMware software-defined data center.

The main goal in developing EMC ViPR was to simplify and cut the cost of managing existing heterogeneous storage infrastructures, and also to create a simple system for managing and accessing data in distributed clustered file systems (for example, ones based on Hadoop clusters) and in cloud environments.

Read more »

Picking apart resource compression in Might and Magic III

3 years, 2 months ago
I don't really remember how I ended up in the DOSBox debugger or why I was picking through 16-bit assembler, reconstructing the unpacking routine for the MM3.CC resource file, but it was great fun. I got the game on Humble Bundle during one of the recent sales, and then on the net I came across Jeff Ludvig's page describing the game-modding problems caused by the compression in MM3.CC. In particular, it said the following:
It turned out that this algorithm is quite hard to crack, and so far nobody has managed to unpack this data.

Challenge accepted. His article describes how he tried to wrestle with the algorithm; I will describe how I went about it, and at the end I will give a link to an open-source utility that can not only unpack an MM3.CC file but also pack one.

DOS Packer

Looking at MM3.EXE, I found that it is a compressed DOS executable with an uncompressed overlay that begins with FBOV. I knew nothing about DOS-era compressors, but I noticed on Jeff Ludvig's page that he used a tool called "Universal Program Cracker" v1.11. I found version 1.10 (released on June 25, 1997) and unpacked the exe with it. I even managed to process the overlay data correctly, but I still wanted to learn the packer's name. I was advised to try the Detect It Easy program, and indeed it reported:

EXECUTRIX-COMPRESSOR(-)[by Knowledge Dynamics Corp]
Borland TLINK(2.0)[-]

For history fans, I can recommend the old discussion threads about this software, from 1991 and 1995:


Read more »

Who compresses better, or Walsh versus Fourier

3 years, 2 months ago
Despite the progress of science and technology, data compression remains one of the pressing problems, and video compression algorithms hold a special place among them. This publication deals with the compression of still color images by JPEG-like algorithms.

To start, I want to thank the author of the articles "Decoding JPEG for dummies" and "Inventing JPEG", which helped me a great deal while writing this publication. As I studied lossy image compression algorithms, one question about JPEG tormented me the whole time: "Why is the core transform in the JPEG algorithm a special case of the Fourier transform?" The author gives an answer to this question, but I decided to approach it not from the standpoint of theory, mathematical models or software implementation, but from the standpoint of circuit design.
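To make the comparison concrete (a toy Python sketch with an invented pixel row, not the article's hardware design): both the DCT-II that JPEG builds on and the Walsh-Hadamard transform concentrate the energy of a smooth signal in their low-order coefficients, but the Walsh-Hadamard butterfly needs only additions and subtractions, which is what makes it tempting from a circuit-design point of view:

```python
import math

def dct2(x):
    # Naive 1-D DCT-II (JPEG applies its 2-D, 8x8 counterpart).
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
            for k in range(N)]

def wht(x):
    # Fast Walsh-Hadamard transform: only adds and subtracts;
    # the length must be a power of two.
    x = list(x)
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            for j in range(i, i + h):
                x[j], x[j + h] = x[j] + x[j + h], x[j] - x[j + h]
        h *= 2
    return x

row = [52, 55, 61, 66, 70, 61, 64, 73]   # invented 8-pixel row
print([round(c) for c in dct2(row)])
print(wht(row))
```

For both transforms the large values land in the first few coefficients, which is exactly what quantization then exploits; the difference is that the WHT datapath contains no multipliers at all.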

The JPEG image compression algorithm is a digital signal processing algorithm, and such algorithms are, as a rule, implemented either on digital signal processors or on FPGAs. In my case, choosing a DSP would have meant coming back to the very thing I was trying to get away from, a software implementation, so I decided to settle on programmable logic.

In one of the online stores I bought a fairly simple development board that carries an Altera Cyclone IV FPGA (EP4CE6E22) and 512Kx8 SRAM. Since the FPGA was made by Altera, I decided to use the Quartus II Web Edition environment for development. The individual functional blocks of the hardware codec were developed in Verilog, and they were assembled into a single design in the schematic editor of the Quartus II Web Edition environment.

For communication with a PC (receiving commands, sending and receiving processed data), I wrote a simple asynchronous UART transceiver in Verilog. To connect to the computer's COM port (RS-232), a level converter was hastily soldered together from a MAX3232WE chip and whatever was at hand. Here is what came of it:

Read more »

Overview of the new line of HP 3PAR storage systems

3 years, 2 months ago
Choosing a storage system is often like choosing a cat in a dark room, and nobody wants to discover after a few years of operation that behind the beautiful facade hide long-outdated technologies, unready for new trends and lacking the "margin of safety" needed to scale in the future.

In our opinion, the 3PAR storage system has turned out very well and has the potential to adopt the fashionable new trends of the storage market, such as inline deduplication, data compression and so on. Whether we are right, time will tell. First, a few general words about 3PAR storage systems. The presentation from which the pictures in this article are taken was published only a few weeks ago. We will mainly cover the changes and innovations in the 3PAR system as a whole and in its storage infrastructure.

What is good about the new HP 3PAR system?

– about that, below the cut

Read more »

PostgreSQL and btrfs — an elephant on an oil diet

3 years, 3 months ago
Recently, while browsing a Wikipedia article on file systems, I became interested in btrfs: its rich feature set, its stable status and, above all, its transparent data compression. Knowing how well databases full of text compress, I was curious to find out just how applicable this is in a usage scenario with, for example, postgres.

Of course, this testing cannot be called complete, since only reads are involved, and linear ones at that. But the results already make a possible move to btrfs worth considering in certain cases.

The main goal, though, is to hear the community's opinion on how reasonable this is and what pitfalls transparent compression at the file-system level may conceal.

For those who don't want to spend the time, I will state the findings right away. A PostgreSQL database placed on btrfs with the compress=lzo option takes half the volume (compared with any FS without compression) and, under multithreaded sequential reads, noticeably reduces the load on the disk subsystem.
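For reference, enabling that option looks like this (a sketch only: the device, mountpoint and fstab line are placeholders, not the author's actual setup):

```shell
# Mount the volume holding the PostgreSQL data directory with
# transparent lzo compression (adjust device and path to your system).
mount -o compress=lzo /dev/sdb1 /var/lib/postgresql

# Or persist it in /etc/fstab:
# /dev/sdb1  /var/lib/postgresql  btrfs  compress=lzo  0  2
```

Note that only data written after the option takes effect is compressed, so a database restored onto the freshly mounted volume shows the full benefit.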

Read more »

Efficient image resizing with ImageMagick

3 years, 3 months ago
Nowadays sites more and more often need responsive design and responsive images, which creates a need to resize all those images efficiently. The system has to work so that each user is served, on demand, a picture of the right size: small for users with small screens, large for large ones.

The web handles this perfectly well, but to deliver pictures of different sizes to different users, all those pictures have to be created first.

Plenty of tools can do the resizing, but too often they produce large files that cancel out the speed gain responsive images are supposed to deliver. Let's look at how ImageMagick, a command-line tool, can resize pictures quickly while preserving excellent quality and producing files of small size.

Big pictures == big problems

The average web page weighs 2 MB, and two thirds of that is images. Millions of people reach the Internet over 3G or worse, and 2-MB sites work terribly for them. Even on a fast connection, such sites can burn through traffic limits. The job of web designers and developers is to make the user's life simpler and better.


Very small sites can simply store several versions of every picture. But what if you have a ton of them? A shop, for example, may have a hundred thousand pictures; producing the variants by hand is not an option.


This command-line utility with 25 years of history is at the same time a full-featured image editor. It has a huge heap of functions, among them fast, automatic resizing of pictures. But with the default settings the files often come out excessively large, sometimes bigger than the original even though they contain fewer pixels. I will explain where the problem lies and show which settings solve it.
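As a taste of what such settings look like (a hedged sketch: the file names, width and quality are placeholders, and these are commonly used ImageMagick options rather than the article's exact recipe):

```shell
# One file: -thumbnail resizes and drops most metadata, -strip removes
# remaining profiles, and an explicit -quality keeps the output from
# inheriting an unnecessarily high quality setting from the source.
convert photo.jpg -thumbnail 400x -strip -quality 82 photo-400.jpg

# A whole directory at once, written into resized/:
mogrify -path resized -thumbnail 400x -strip -quality 82 *.jpg
```

The difference between `-resize` and `-thumbnail`, and the choice of quality value, are exactly the kind of knobs the article goes on to discuss.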

Read more »

Optimal file ordering for a solid archive

3 years, 7 months ago
An embodiment of one idea: arrange the files so that the archive size is minimal.
The program checks how well files compress in pairs and then sorts the list for compression by the archiver.
If anyone needs it, take it.
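The idea admits a compact sketch (a toy Python illustration: zlib stands in for the real archiver, the file contents are invented, and the actual program's heuristics may differ):

```python
import zlib

def joint_cost(a, b):
    # Extra compressed bytes that b costs when packed right after a:
    # a rough proxy for how well a solid archiver exploits similarity.
    return len(zlib.compress(a + b)) - len(zlib.compress(a))

def greedy_order(files):
    # Nearest-neighbour ordering: start with the first file and keep
    # appending whichever file compresses best against the current tail.
    remaining = list(files)
    order = [remaining.pop(0)]
    while remaining:
        best = min(remaining, key=lambda f: joint_cost(order[-1], f))
        remaining.remove(best)
        order.append(best)
    return order

files = [b"aaaa" * 50, b"abab" * 50, b"zzzz" * 50, b"aaab" * 50]
ordered = greedy_order(files)
print(len(zlib.compress(b"".join(ordered))))
```

Finding the truly optimal order is a travelling-salesman-like problem, so a greedy pass such as this is a practical approximation rather than a guarantee of the minimum.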

Read more »

Processing large packed files on the Mac, and not only there

3 years, 8 months ago
Once I had the task of processing a log file. In principle a banal task; I use Perl for it both on Linux and on Windows. But the thing is, all this takes place on a Mac, the file is inside an archive, and it is big: unpacked, it occupies about 20 GB.
What would the normal solution be?
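One answer, sketched here in Python rather than the author's Perl (a toy gzip file stands in for the real archive, whose format the post does not specify): process the archive as a stream, so the 20 GB never have to be unpacked to disk:

```python
import gzip
import os
import tempfile

# Build a toy compressed log (stands in for the ~20 GB archive).
path = os.path.join(tempfile.mkdtemp(), "app.log.gz")
with gzip.open(path, "wt", encoding="utf-8") as f:
    for i in range(1000):
        f.write(f"line {i} {'ERROR' if i % 100 == 0 else 'ok'}\n")

# Process it as a stream: gzip.open decompresses lazily, one buffered
# chunk at a time, so memory use stays flat regardless of unpacked size.
error_count = 0
with gzip.open(path, "rt", encoding="utf-8") as log:
    for line in log:
        if "ERROR" in line:
            error_count += 1
print(error_count)  # -> 10
```

The same streaming shape works as a shell pipe into a Perl script, which is presumably where the post is heading.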

Read more »