## Python Meetup October: Deliberate Practice and desktop applications with Penta.by

2 years, 11 months ago
Hi everyone!
We are happy to share the video recordings of the talks from the latest meetup of the Minsk Python community.
Below the cut you will find the following talks:
• Deliberate Practice: Coding Dojo, Code Kata and Coderetreat / Sergey Sergienko
• Rapid development of desktop applications with Penta.by / Andrey Puchko
• WRK: Modern HTTP benchmarking tool / Alexey Romanov

Enjoy the videos!

## Simple metasearch algorithm on Python

2 years, 11 months ago

#### Lyrical digression

As part of my research work at university, I ran into the task of classifying text. Essentially, I needed an algorithm that, given a text document as input, would return an array in which each element is a measure (a probability or a confidence score) of how strongly the text belongs to one of a predefined set of topics.

This article is not about solving the classification problem itself, but about an attempt to automate the most tedious stage of building the topic catalogue: assembling the training set.

#### When you are too lazy to do it by hand

The first and most obvious thought was to write a simple metasearch algorithm in Python. In other words, all the automation boils down to using the results of another search engine (Google Search), since I had no suitable databases of my own. I should note right away that there are ready-made libraries solving a similar problem, for example pygoogle.
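The idea can be sketched in a few lines of stdlib-only Python. This is my own illustration, not pygoogle's API: build a search-engine query URL, then pull result links out of the returned HTML. The sample HTML string below stands in for a real response page, which you would fetch over the network in practice.

```python
from html.parser import HTMLParser
from urllib.parse import urlencode

def build_query_url(terms, base="https://www.google.com/search"):
    # In a real run you would fetch this URL with urllib.request or requests.
    return base + "?" + urlencode({"q": " ".join(terms)})

class LinkExtractor(HTMLParser):
    """Collect absolute links from <a href="..."> tags in a results page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.startswith("http"):
                self.links.append(href)

# Stand-in for a real search-results page:
sample_page = '<a href="https://example.com/doc1">Doc 1</a><a href="/internal">skip</a>'
parser = LinkExtractor()
parser.feed(sample_page)
# parser.links now holds the candidate documents for the training set
```

The harvested links would then be fetched and their text stored, one topic per query, to populate the training set.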

## A neural network in Python, part 2: gradient descent

2 years, 11 months ago
Part 1

#### Just give me the code!

```python
import numpy as np

# training data: 4 samples with 3 inputs each; y is the target column vector
X = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]])
y = np.array([[0,1,1,0]]).T
alpha, hidden_dim = (0.5, 4)
# weight matrices initialized in [-1, 1)
synapse_0 = 2*np.random.random((3, hidden_dim)) - 1
synapse_1 = 2*np.random.random((hidden_dim, 1)) - 1
for j in range(60000):  # xrange in the original Python 2 code
    # forward pass through two sigmoid layers
    layer_1 = 1/(1+np.exp(-(np.dot(X, synapse_0))))
    layer_2 = 1/(1+np.exp(-(np.dot(layer_1, synapse_1))))
    # backpropagate the error and update the weights by gradient descent
    layer_2_delta = (layer_2 - y)*(layer_2*(1-layer_2))
    layer_1_delta = layer_2_delta.dot(synapse_1.T) * (layer_1 * (1-layer_1))
    synapse_1 -= (alpha * layer_1.T.dot(layer_2_delta))
    synapse_0 -= (alpha * X.T.dot(layer_1_delta))
```

#### Part 1: Optimization

In the first part I described the basic principles of backpropagation in a simple neural network. Backpropagation lets us measure how much each of the network's weights contributes to the error, and that in turn lets us adjust the weights with a different algorithm: gradient descent.

The point is that backpropagation itself does not perform any optimization of the network. It propagates the error information from the end of the network back across all of its weights so that another algorithm can then optimize those weights to fit our data. In principle, there are many other nonlinear optimization methods that we can combine with backpropagation:
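One such alternative, sketched on the same toy network: gradient descent with momentum. Backpropagation still computes the gradients exactly as above; only the update rule changes. The velocity terms `v_0`, `v_1`, the momentum value, and the seed are my additions for illustration, not part of the original code.

```python
import numpy as np

np.random.seed(1)  # fixed seed so the run is reproducible
X = np.array([[0,0,1],[0,1,1],[1,0,1],[1,1,1]])
y = np.array([[0,1,1,0]]).T
alpha, momentum, hidden_dim = 0.5, 0.9, 4
synapse_0 = 2*np.random.random((3, hidden_dim)) - 1
synapse_1 = 2*np.random.random((hidden_dim, 1)) - 1
# velocity accumulators, one per weight matrix
v_0 = np.zeros_like(synapse_0)
v_1 = np.zeros_like(synapse_1)
for j in range(60000):
    layer_1 = 1/(1+np.exp(-X.dot(synapse_0)))
    layer_2 = 1/(1+np.exp(-layer_1.dot(synapse_1)))
    layer_2_delta = (layer_2 - y)*(layer_2*(1-layer_2))
    layer_1_delta = layer_2_delta.dot(synapse_1.T)*(layer_1*(1-layer_1))
    # momentum: keep a decaying running sum of past steps instead of the raw step
    v_1 = momentum*v_1 + alpha*layer_1.T.dot(layer_2_delta)
    v_0 = momentum*v_0 + alpha*X.T.dot(layer_1_delta)
    synapse_1 -= v_1
    synapse_0 -= v_0
```

With momentum, consistent gradient directions accumulate speed while oscillating ones cancel out, which often converges faster than the plain update.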

## Calibrating Kinect v2 with OpenCV in Python

2 years, 11 months ago
Not long ago we started a couple of projects that need an optical system with a depth channel, and decided to use Kinect v2 for the purpose. Since the projects are implemented in Python, we first had to get the Kinect working from Python, and then calibrate it: out of the box, the Kinect introduces geometric distortions into its frames and produces errors of several centimetres when measuring depth.

I had never dealt with computer vision, OpenCV, or Kinect before, and I could not find an exhaustive guide to working with all of this either, so in the end there was quite a bit of tinkering. I decided it would not hurt to systematize the experience I gained in this article. Perhaps it will be useful to some fellow sufferer, and besides, we needed a popular-style article for our reporting.
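For context, the "geometric distortions" a camera calibration corrects are usually modelled with radial distortion coefficients. A minimal numpy sketch of the polynomial radial term of the Brown distortion model that OpenCV estimates during calibration; the coefficient values here are made up purely for illustration.

```python
import numpy as np

def distort(points, k1, k2):
    """Apply radial distortion to Nx2 normalized image coordinates."""
    r2 = np.sum(points**2, axis=1, keepdims=True)  # squared radius per point
    factor = 1 + k1*r2 + k2*r2**2                  # radial scaling factor
    return points * factor

pts = np.array([[0.0, 0.0], [0.1, 0.2], [0.3, -0.4]])
distorted = distort(pts, k1=0.1, k2=0.01)
# the image centre is unaffected; points further out are pushed outward
```

Calibration runs this logic in reverse: given observed (distorted) positions of known calibration-pattern points, it solves for k1, k2 and the camera matrix so that frames can be undistorted.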

## Django: how to quickly get unwanted duplicates from a simple QuerySet

2 years, 11 months ago
Greetings, everyone!

I just found an interesting bug (a bug from the point of view of human logic, though not of the machine) and decided to share it with the community. I have been programming in Django for quite a while, but ran into this behaviour for the first time, so I think it will be useful to somebody. Well, down to business!

Suppose our code contains a primitive snippet like this:
```python
# views.py
ids = [5201, 5230, 5183, 5219, 5217, 5209, 5246, 5252, 5164, 5248, ...<etc.>...]
products = Product.objects.filter(id__in=ids)
```

The retrieved products are displayed with pagination, 20 items per page. One day a manager calls and says that the products "jump" between pages: an item is first spotted on the second page and then suddenly repeats on the fifth.

"Ha," we say, put a breakpoint right after the code above and do print(products). We check the output visually and, for good measure, with a loop, and there are no duplicates!
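Without spoiling the article's punchline, the symptom itself, items repeating across pages while the full list has no duplicates, is exactly what happens when you paginate a queryset with no fixed ordering: the database is free to return rows in a different order on each request. A stdlib-only simulation of the mechanism (my illustration, not the article's code):

```python
def page(items, page_num, per_page=20):
    """Slice one page out of a result list, 1-indexed, like a paginator."""
    start = (page_num - 1) * per_page
    return items[start:start + per_page]

ids = list(range(100))
first_request_order = ids                   # order the DB happened to return
later_request_order = list(reversed(ids))   # a different, equally "valid" order

page_2 = page(first_request_order, 2)   # ids 20..39 on the first request
page_4 = page(later_request_order, 4)   # the very same ids show up again
```

In Django terms, an explicit order_by() (for example .order_by('id')) pins the ordering and makes pagination deterministic; Django itself warns about paginating an unordered queryset.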

## The history of one optimization: transmitting and processing battle results

2 years, 11 months ago
Today I will tell you about a small part of a big project: World of Tanks. Many of you probably know World of Tanks from the player's side; I suggest looking at it from the developer's point of view. This article is about the evolution of one of the project's technical solutions, namely the transmission and processing of battle results.

## Django 1.9 release

2 years, 11 months ago

## A Python 3 library for connecting to ESIA: esia-connector

2 years, 11 months ago
It all began when the Ministry of Telecom and Mass Communications allowed the state services portal to be used for identifying and authenticating users on non-government websites. This is implemented via the ESIA service (the Unified System of Identification and Authentication, esia.gosuslugi.ru). Our customer was among the first five participants to apply for integration with ESIA, which for us translated into the task of supporting that integration.

We could not find an open, free solution suitable for our technology stack, so after finishing the development we decided, with the customer's blessing, to share our own (under the BSD license).

So, we present the esia-connector project: it is written in Python 3, uses the openssl utility, and has been tested only on Debian-based systems.

Package: pypi.python.org/pypi/esia-connector
Project: github.com/saprun/esia-connector

## PythonDigest: issue number 100 and other remarkable news

2 years, 11 months ago
Python is alive! That is the loud statement I want to open this article with.

This week the 100th issue of Python Digest came out. On that occasion we decided to sum up our work on the digest and describe the trends we noticed while collecting news about the language to which, after all, the digest is devoted.

At the end of 2014 we summed up the year's results in http://habrahabr.ru/post/247067/. This time we will look at the trends for the incomplete 11 months of 2015.

But before moving on to the trends, let us recap what we have done over the (almost) year.

Those interested in the details are welcome below the cut.

## We need less powerful programming languages

2 years, 11 months ago

Today many systems and programming languages are marketed as "powerful". One cannot say that this is bad; almost every one of us finds it attractive. But in this post I want to put forward the view that in many cases we need less powerful programming languages and systems. Before continuing, a caveat: there will be little here that is original, little of my own reflection. I will lay out a train of thought that arose after reading Douglas Hofstadter's book "Gödel, Escher, Bach", which helped me aggregate the separate ideas and thoughts wandering around in my head. The material below was also strongly influenced by Philip Wadler's post and a video from a Scala conference. The key thought is as follows:

Every increase in expressiveness places an additional burden on everyone who wants to understand the message.

And it is exactly this point of view that I want to illustrate with examples that will be closer and clearer to the community of Python programmers.
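One such illustration (my own example, not necessarily the author's): a less powerful "language" for configuration is easier to reason about. `eval()` accepts arbitrary Python, so a reader must understand everything Python can possibly do; `ast.literal_eval()` accepts only literals, so its behaviour is fully predictable from the text alone.

```python
import ast

config_text = "{'retries': 3, 'timeout': 1.5}"

# Full power: eval would also happily run side effects, imports, network calls...
dangerous = eval(config_text)

# Restricted power: only literal structures are allowed, nothing can "happen".
safe = ast.literal_eval(config_text)

# The restricted evaluator rejects anything carrying behaviour:
try:
    ast.literal_eval("__import__('os').getcwd()")
except ValueError:
    rejected = True
```

Both calls produce the same dict here, but only the restricted one lets the reader, and static tooling, know this without simulating the whole language.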