Journal "Cybernetics and Programming"
Rubric "Data Encryption and Data Protection"
Galanina N.A., Ivanova N.N.  Analysis of the effectiveness of synthesis of computing devices for non-positional digital signal processing


pp. 1-6

DOI: 10.7256/2306-4196.2015.3.15354
Abstract: The article investigates methods, algorithms and computing devices for encoding, digital filtering and spectral analysis of signals. The subject of the study is methods for the synthesis and analysis of devices for digital filtering and spectral analysis of signals in the residue number system. The article presents an efficiency analysis of the synthesis of computing devices for non-positional digital signal processing in the residue number system, together with a comparative performance evaluation of computing devices for digital filtering and spectral analysis. The authors propose a method for increasing the speed of digital devices in the residue number system. The research is based on the apparatus of mathematical analysis, mathematical logic, the theory of algorithms, the theory of algebraic integers, automata theory, the theory of the discrete Fourier transform and its fast variants, probability theory, mathematical methods and simulation. The study presents ways of implementing digital signal processing algorithms in the residue number system on modern signal processors, taking the peculiarities of that system into account. Implementation of digital devices on digital signal processors intended for data processing in non-positional number systems, including the residue number system, is a promising line of development for digital signal processing devices.
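The abstract above centers on the residue number system. As a minimal sketch of why it permits carry-free, channel-wise arithmetic (the moduli 3, 5, 7 are chosen arbitrarily for illustration; this is not the authors' hardware): a number is stored as its residues modulo pairwise coprime moduli, operations act on each residue independently, and the result is recovered by the Chinese remainder theorem.

```python
from math import prod

MODULI = (3, 5, 7)  # pairwise coprime; dynamic range is 3*5*7 = 105

def to_rns(x):
    """Encode an integer as its residues modulo each modulus."""
    return tuple(x % m for m in MODULI)

def rns_mul(a, b):
    """Multiply channel-wise: no carries propagate between residues."""
    return tuple((x * y) % m for x, y, m in zip(a, b, MODULI))

def from_rns(r):
    """Recover the integer by the Chinese remainder theorem."""
    M = prod(MODULI)
    total = 0
    for ri, mi in zip(r, MODULI):
        Mi = M // mi
        total += ri * Mi * pow(Mi, -1, mi)  # modular inverse of Mi mod mi
    return total % M

# 8 * 9 = 72, computed independently in each residue channel
print(from_rns(rns_mul(to_rns(8), to_rns(9))))  # → 72
```

Because each channel is narrow and independent, the multiplications can run in parallel, which is the speed advantage the article exploits.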
Sidel'nikov O.V.  Comparison of computational complexity of classification algorithms for recognizing the signs of cyber attacks


pp. 7-16

DOI: 10.7256/2306-4196.2014.6.13306
Abstract: The article compares the computational complexity of two logical classification algorithms: a sequential (brute-force) search algorithm and an inductive state prediction algorithm. Both algorithms are implemented in Matlab, and their computational complexity is compared using Zakrevskiy's technique. Classification is one of the central problems in detecting the threat of cyber attacks on an information system. Information about the signs of cyber attacks can be received from various sources (sensors) in the software and hardware of the information system, for example antivirus tools, RAM dumps, hard drive logs, user logon records, etc. Each of these sources contains information that can be used to determine the presence of an attack on the system. The article examines the logical classification of such existing data using the two algorithms. The adapted inductive state prediction method reduced the amount of computation, yielding an average gain of K ≈ 9.3 and thereby shortening the time needed to detect computer attacks.
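The brute-force baseline discussed above can be sketched as an exhaustive search for a conjunction of attack signs consistent with labeled observations (toy data and feature names are invented; Zakrevskiy's technique and the inductive algorithm itself are not reproduced here). Counting the candidates tried makes the cost the article's inductive method reduces explicit.

```python
from itertools import combinations

# Toy binary attack signs per observation (invented for illustration):
# features = (antivirus_alert, failed_logins, ram_anomaly); label 1 = attack
DATA = [
    ((1, 1, 0), 1), ((1, 1, 1), 1), ((0, 1, 0), 0),
    ((1, 0, 0), 0), ((0, 0, 1), 0), ((1, 1, 0), 1),
]

def matches(conj, x):
    """A conjunction fires when every selected feature is set."""
    return all(x[i] for i in conj)

def consistent(conj):
    """The rule must reproduce the label of every observation."""
    return all(matches(conj, x) == y for x, y in DATA)

def brute_force(n_features=3):
    """Try every non-empty conjunction of features, smallest first."""
    tried = 0
    for size in range(1, n_features + 1):
        for conj in combinations(range(n_features), size):
            tried += 1
            if consistent(conj):
                return conj, tried
    return None, tried

print(brute_force())  # → ((0, 1), 4): antivirus_alert AND failed_logins
```

On n features the exhaustive search examines up to 2^n - 1 conjunctions; the article's inductive state prediction cuts that work by roughly a factor of 9.3 on its data.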
Piskova A.V., Korobeinikov A.G.  Features of applying lattice theory in digital signature schemes


pp. 8-12

DOI: 10.7256/2306-4196.2016.2.17970
Abstract: The subject of the study is digital signature schemes, an important element in building secure systems and a component of most real-world security protocols. The reliability of existing digital signature schemes could be severely undermined by advances in classical cryptanalysis or by progress in the development of quantum computers. A potential alternative is to construct schemes based on lattice problems that are believed to be intractable for quantum computers. Thanks to significant scientific advances in recent years, lattice-based schemes are already used in practice and are a viable alternative to number-theoretic cryptography. The study relies on methods of lattice theory, a choice dictated by the absence of polynomial-time algorithms for the shortest vector and closest vector problems. The main conclusion of the paper is that the principal direction of future development for lattice-based digital signature schemes is their optimization and the implementation of the Fiat-Shamir model. For example, the BLISS scheme has shown high performance and can therefore be integrated into portable systems and devices.
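The hardness assumption mentioned above can be made concrete in the one dimension where the shortest vector problem is actually easy: Lagrange-Gauss reduction finds a shortest nonzero vector of a 2D lattice in polynomial time (a sketch for intuition only; the schemes discussed work in hundreds of dimensions, where no polynomial-time algorithm is known).

```python
def gauss_reduce(u, v):
    """Lagrange-Gauss reduction of a 2D lattice basis.

    Returns a basis whose first vector is a shortest nonzero lattice
    vector. Dimension 2 is tractable; lattice-based signatures rely
    on this problem becoming hard in high dimensions.
    """
    def norm2(w):
        return w[0] * w[0] + w[1] * w[1]

    u, v = list(u), list(v)
    if norm2(u) > norm2(v):
        u, v = v, u
    while True:
        # Subtract the integer multiple of u that brings v closest to 0
        m = round((u[0] * v[0] + u[1] * v[1]) / norm2(u))
        v = [v[0] - m * u[0], v[1] - m * u[1]]
        if norm2(v) >= norm2(u):
            return u, v
        u, v = v, u

# A deliberately skewed basis of the lattice spanned by (2,-1), (1,2)
u, v = gauss_reduce((7, -1), (16, -3))
print(u, v)  # both reduced vectors have squared norm 5
```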
Khaliullin A.I.  Implementation of the electronic workflow in the activities of law enforcement organizations of the Commonwealth of Independent States


pp. 12-16

DOI: 10.7256/2306-4196.2013.6.10279
Abstract: The article reviews the current state and prospects of electronic workflow systems in the activities of the law enforcement agencies of the Commonwealth of Independent States, using as an example the multiservice network for secure exchange of criminalistic data between the Ministries of Internal Affairs (police forces) of the CIS member states. Information technologies provide rapid exchange of information between the law enforcement agencies of the CIS. The multiservice network is a hardware-software complex that includes software and an automated workstation for the specialist. Its efficiency is also determined by the amount of criminalistic data in the database, which is most actively filled by the Ministries of Internal Affairs of Russia, Belarus and Tajikistan. The author suggests ways to improve the multiservice network with regard to maintaining the legal significance of procedural documents transmitted within it through the use of electronic signatures.
Galanina N.A. et al.  Ways of implementing digital signal encoders using residues in the residue number system


pp. 21-36

DOI: 10.7256/2306-4196.2013.1.8311
Abstract: The article presents an analytical review of ways to implement encoders of input residues in the residue number system and justifies the selection of optimal structures, evaluating the hardware and time costs of the circuit solutions considered. The purpose of the study is to examine all feasible options for encoding input signals into residual classes on a modern element base while taking full advantage of the residue number system, to evaluate the hardware and time complexity of these options, and to select and justify the best solution according to the stated criteria. The hardware cost of logic-based encoders is expressed as the number of two-input logic elements, and for an EPROM as its information capacity in bits. The hardware cost of encoders built on logic circuits depends on how many parts the input sequence is divided into. It is concluded that logic-based encoders can be simplified further and, as a consequence, hardware expenses reduced.
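The trade-off described above, splitting the input word to shrink lookup tables, can be modeled in a few lines (a toy software model with invented word sizes, not the circuit solutions the article evaluates): a direct EPROM-style table for an n-bit input needs 2^n entries, while splitting the word into two halves needs two far smaller tables plus one modular addition.

```python
M = 7          # modulus of one residue channel
BITS = 16      # width of the input word (toy value)
HALF = BITS // 2

# Direct encoder: one table indexed by the whole word → 2**16 entries
direct_table = [x % M for x in range(2 ** BITS)]

# Split encoder: x = hi * 2**HALF + lo, so
# x mod M = (table_hi[hi] + table_lo[lo]) mod M
table_lo = [x % M for x in range(2 ** HALF)]
table_hi = [(x << HALF) % M for x in range(2 ** HALF)]

def encode_split(x):
    """Residue of x via two small tables and one modular addition."""
    return (table_hi[x >> HALF] + table_lo[x & (2 ** HALF - 1)]) % M

# Both encoders agree on every 16-bit input
assert all(encode_split(x) == direct_table[x] for x in range(2 ** BITS))
print(len(direct_table), len(table_lo) + len(table_hi))  # → 65536 512
```

The memory drops from 65536 entries to 512 at the cost of one extra adder, the kind of hardware/time trade the article quantifies.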
Sivachev A.V.  Increasing the efficiency of steganalysis in the discrete wavelet transform domain of an image by analyzing parameters of the image frequency domain


pp. 29-37

DOI: 10.25136/2306-4196.2018.2.25564
Abstract: The object of the study is methods of steganalysis in the discrete wavelet transform (DWT) domain of an image. The author investigates how embedding in the DWT domain affects the coefficients of the discrete cosine transform (DCT) and discrete sine transform (DST) domains of the image, in order to improve detection of embedding in the DWT domain. The influence of embedding in the DWT domain on particular DCT and DST coefficients is demonstrated, and the author proposes using these coefficients to improve the quality of training of the support vector machine. Research method: to assess the effectiveness of the proposed steganalysis method, image classification performance with the proposed coefficients is compared against other popular steganalysis methods for the wavelet decomposition domain. The steganographic influence consists in modifying the least significant bits of the DWT coefficients. The main result is a demonstration that particular DCT and DST coefficients can be used for steganalysis in the DWT domain. Based on the results, an original steganalysis method is proposed that increases detection efficiency for the LH and HL subbands of the image DWT. The results can be used in developing steganalysis systems that effectively detect embedding in the DWT domain of an image.
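The kind of embedding this article analyzes can be sketched as follows (a numpy-only one-level Haar transform stands in for the wavelet actually used; the embedding and payload are illustrative): compute the DWT, then overwrite least significant bits of rounded high-frequency coefficients. A steganalyzer then looks for the statistical traces this leaves in other transform domains.

```python
import numpy as np

def haar2d(img):
    """One-level 2D Haar DWT: returns LL, LH, HL, HH subbands."""
    a = img.astype(float)
    lo = (a[:, ::2] + a[:, 1::2]) / 2        # row-wise average
    hi = (a[:, ::2] - a[:, 1::2]) / 2        # row-wise detail
    ll, lh = (lo[::2] + lo[1::2]) / 2, (lo[::2] - lo[1::2]) / 2
    hl, hh = (hi[::2] + hi[1::2]) / 2, (hi[::2] - hi[1::2]) / 2
    return ll, lh, hl, hh

def embed_lsb(band, bits):
    """Overwrite least significant bits of rounded coefficients."""
    q = np.round(band).astype(int)
    flat = q.flatten()
    flat[: len(bits)] = (flat[: len(bits)] & ~1) | bits
    return flat.reshape(band.shape)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(8, 8))
ll, lh, hl, hh = haar2d(img)
stego_lh = embed_lsb(lh, np.array([1, 0, 1, 1]))  # payload in LH subband
print(stego_lh.flatten()[:4] & 1)  # → [1 0 1 1]
```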
Borodin A.V.  Feasibility study of a technology for ensuring the integrity and authenticity of information on paper media in remote document handling


pp. 30-47

DOI: 10.7256/2306-4196.2017.1.22192
Abstract: The object of the research, in a broad sense, is the document flow system of a commercial enterprise that renders services to the public and uses the Internet as its main channel of communication with clients, while relying on traditional paper document flow, with documents delivered in hard copy by a mail service, to ensure the validity of agreements between the enterprise and its clients. The subject of the research is the process of remote document processing on the client side under conditions in which the enterprise, as a party to the transaction, has no ability to control this process. Special attention is paid to substantiating the economic feasibility of implementing the proposed remote document handling process. The methodological basis of the research is the systems concept and, in particular, the author's technologies of ontological analysis. Based on the analysis of an ontological domain model, a specific technical solution for securing the remote document handling process is proposed, and an event model of this process is synthesized. This model is studied using approaches from the algebraic theory of risk. The scientific novelty of the research consists in a unique combination of technical solutions to the stated problem; a preliminary market analysis showed no similar solutions in the practice of the interested companies. The main conclusion is that remote document handling technologies can feasibly serve as a transition stage toward fully electronic document management between a commercial enterprise and its counterparties of arbitrary nature.
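One generic building block of such a scheme (a sketch under assumptions, not Borodin's actual solution; the key and document text are invented) is a short authentication code printed on the paper document itself: the enterprise derives it from the document text with a keyed hash, and any party holding the key can later verify that the hard copy was not altered in transit.

```python
import hashlib
import hmac

SECRET_KEY = b"enterprise-signing-key"   # hypothetical shared key

def stamp(document_text: str) -> str:
    """Short code to print on the paper copy of the document."""
    mac = hmac.new(SECRET_KEY, document_text.encode("utf-8"),
                   hashlib.sha256)
    return mac.hexdigest()[:16]          # truncated for manual entry

def verify(document_text: str, printed_code: str) -> bool:
    """Re-derive the code from the retyped text and compare safely."""
    return hmac.compare_digest(stamp(document_text), printed_code)

code = stamp("Service agreement No. 17, total 1000 RUB")
print(verify("Service agreement No. 17, total 1000 RUB", code))  # True
print(verify("Service agreement No. 17, total 9000 RUB", code))  # False
```

An HMAC only proves integrity to key holders; legal non-repudiation, which the article ties to electronic signatures, would require an asymmetric signature instead.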
Korobeinikov A.G., Kutuzov I.M., Kolesnikov P.Y.  Analysis of obfuscation methods


pp. 31-37

Abstract: Modern computer technology makes a variety of information security tasks relevant. For example, steganography methods are used to protect copyright in images, while code obfuscation techniques are used to address the problem of proving (or disproving) authorship of code. Obfuscation (from Latin obfuscare, "to darken, obscure", and English obfuscate, "to make non-obvious, confusing") is the process of transforming the source code or executable code of a program into a form that preserves its functionality but complicates analysis, understanding of its algorithms, and modification during decompilation. There are now special programs, called obfuscators, that perform obfuscation in various ways. The article discusses obfuscation techniques from the most basic to sophisticated polymorphic generators performing mathematical transformation of the code, as well as the relationship between obfuscation, the efficiency of code execution, and reduction of program size. The authors also outline the further development of obfuscation techniques.
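A minimal illustration of the idea (an invented toy example, not taken from the article): the two functions below compute the same sum, but the second hides its intent behind meaningless names, a cancelling arithmetic transformation, and an opaque predicate, a condition whose value is fixed but not obvious to an analyzer.

```python
def total(prices):
    """Readable original: sum a list of prices."""
    result = 0
    for p in prices:
        result += p
    return result

def l1(l0):
    """Obfuscated equivalent of total()."""
    o0, o1 = 0, len(l0)
    while o1 * o1 >= 0:                   # opaque predicate: always true
        for o2 in l0:
            o0 = (o0 + o2 + o1) - o1      # "+ o1 - o1" cancels out
        break
    return o0

print(total([3, 4, 5]), l1([3, 4, 5]))  # → 12 12
```

Note the trade-off the article examines: the obfuscated version does strictly more work per element, so resistance to analysis is bought with execution efficiency.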
Dikii D.I., Grishentsev A.Y., Savchenko-Novopavlovskaya S.L., Nechaeva N.V., Eliseeva V.V., Artemeva  Development of a neural network module for user authentication based on handwriting dynamics


pp. 55-63

DOI: 10.25136/2306-4196.2018.1.19801
Abstract: The article is devoted to the development and study of the structure of a neural network module that forms part of a user authentication system for various information systems based on analysis of handwriting dynamics, along with the algorithm for training this module. The main task of the neural network module is to implement a binary classifier over input feature vectors such as the Cartesian coordinates of the handwriting sample along the abscissa and ordinate axes, as well as time slices describing the writing speed of the sample. For the candidate structures of the neural network module, an experiment was performed in which different numbers of handwriting samples were fed to the input in order to determine the most stable structure. A mathematical model of the neural network module and a genetic algorithm for its training are described. The article also reviews the neural network structures used in other user authentication software based on handwriting dynamics, and substantiates the choice of module structure based on the experimental results. The neural network module is implemented in the Java programming language.
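The article's module is a neural network trained by a genetic algorithm in Java; as a rough stand-in for the binary-classifier idea, the sketch below trains a single logistic neuron by gradient descent on invented (mean x, mean y, duration) stroke features to separate "owner" from "impostor" samples.

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented features per handwriting sample: mean x, mean y, duration
owner    = rng.normal([0.4, 0.6, 1.0], 0.05, size=(40, 3))
impostor = rng.normal([0.6, 0.4, 1.6], 0.05, size=(40, 3))
X = np.vstack([owner, impostor])
y = np.array([1] * 40 + [0] * 40)        # 1 = legitimate owner

w, b = np.zeros(3), 0.0
for _ in range(2000):                     # plain gradient descent
    p = 1 / (1 + np.exp(-(X @ w + b)))    # sigmoid output
    grad = p - y                          # dLoss/dlogit for log-loss
    w -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean()

pred = (1 / (1 + np.exp(-(X @ w + b))) > 0.5).astype(int)
print((pred == y).mean())                 # training accuracy, near 1.0
```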
Bashmakov D.A., Prokhozhev N.N., Mikhailichenko O.V., Sivachev A.V.  Application of pixel neighborhood matrices to improve the accuracy of steganalysis of static digital images with a homogeneous background


pp. 64-72

DOI: 10.25136/2306-4196.2018.1.24919
Abstract: The article examines the accuracy of steganalysis using the Weighted Stego algorithm for passive countermeasures against covert data transmission channels that embed into the least significant bits of the spatial domain of static digital images with the RGB color model. The dependence of Weighted Stego steganalysis accuracy on the fraction of homogeneous background in the analyzed image is studied, and the drop in pixel prediction accuracy in background areas is investigated using the prediction model proposed by the authors of the original Weighted Stego algorithm. The basis of the steganalysis algorithm is a model that predicts the pixel values of the analyzed image from adjacent pixels. The BOWS2 collection was used to assess the effectiveness of the analysis. Information is embedded by changing the least significant bits of the image in the spatial domain with a payload of 3-5%. The effectiveness of the methods is determined from the true positive, true negative, false positive and false negative image classification rates. A drop in Weighted Stego steganalysis accuracy with an increasing fraction of homogeneous background is demonstrated, and an improvement of the pixel prediction model underlying Weighted Stego is proposed that levels out this drop. The results are useful to information security specialists in detecting and countering hidden data channels and can be used in developing steganalysis systems based on the Weighted Stego algorithm.
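The prediction model at the core of such steganalysis can be sketched in a few lines (a simplified grayscale version assuming the common 4-neighbor average predictor; the article's RGB specifics are omitted): each pixel is predicted from its neighbors, and on a homogeneous background the predictor is exact, so LSB embedding leaves a telltale residual.

```python
import numpy as np

def predict_neighbors(img):
    """Predict each interior pixel as the mean of its 4 neighbors."""
    a = img.astype(float)
    return (a[:-2, 1:-1] + a[2:, 1:-1] + a[1:-1, :-2] + a[1:-1, 2:]) / 4

# Homogeneous background: the predictor is exact, residual is zero
cover = np.full((16, 16), 128)
res_cover = np.abs(cover[1:-1, 1:-1] - predict_neighbors(cover))

# LSB embedding flips pixel values by 1, leaving a nonzero residual
rng = np.random.default_rng(2)
stego = cover ^ rng.integers(0, 2, size=cover.shape)
res_stego = np.abs(stego[1:-1, 1:-1] - predict_neighbors(stego))

print(res_cover.mean(), res_stego.mean())  # 0.0 vs a positive value
```

On textured images the cover residual is already large, which is why the background fraction matters so much for this detector's accuracy.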
Gorbunova E.S.  Dynamic Authentication of Users in Learning Management Systems


pp. 65-72

DOI: 10.7256/2306-4196.2016.4.19517
Abstract: The object of the research is the mechanism of dynamic authentication via keystroke dynamics. The author examines reinforced user authentication in Learning Management Systems, as e-learning gradually occupies its niche in the education environment. The purpose of the research is to develop and verify a dynamic authentication system. Special attention is paid to analyzing biometric authentication, developing the architecture of the required system and an algorithm for classifying users based on learning the classifier's parameters, and testing the results. The author analyzes the methods and algorithms used in the field of dynamic authentication and offers an alternative solution to the problem. The main results of the research include an architecture for the user authentication mechanism in Learning Management Systems and a description of the algorithm for dividing users into two classes. In accordance with the derived system requirements, the author implemented the mechanism and tested it; the tests show that the desired first- and second-order error rates were obtained. The keystroke-dynamics authentication mechanism can be used not only in Learning Management Systems but also in other systems with a similar violator model.
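The raw material of keystroke-dynamics authentication (a schematic example with invented timestamps, not the article's system) is typically dwell time, how long each key is held, and flight time, the gap between releasing one key and pressing the next:

```python
# (key, press_time_ms, release_time_ms) — invented sample of typing "code"
events = [("c", 0, 95), ("o", 140, 230), ("d", 300, 370), ("e", 430, 540)]

def keystroke_features(ev):
    """Dwell times per key and flight times between consecutive keys."""
    dwell = [rel - prs for _, prs, rel in ev]
    flight = [ev[i + 1][1] - ev[i][2] for i in range(len(ev) - 1)]
    return dwell, flight

dwell, flight = keystroke_features(events)
print(dwell)   # → [95, 90, 70, 110]
print(flight)  # → [45, 70, 60]
```

Vectors like these are what a two-class classifier of the kind the article describes consumes.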
Fayskhanov I.F.  Authentication of users by stable keystroke dynamics on freely chosen text


pp. 72-86

DOI: 10.25136/2306-4196.2018.3.25044
Abstract: The subject of the research is the dynamic process of user authentication by keystroke dynamics on freely chosen text. This process is a regular "friend-or-foe" check of the user: the user entering text is under continuous monitoring by the system, and if the identification characteristics do not match, the system refuses to continue working. Free text is understood as follows: the user types text in the course of his current tasks, while the system analyzes this work, extracts features, learns, and, if the characteristics do not match, blocks access. The research methods are theoretical (investigation, search and calculation) and empirical (experiment, comparison and study). The novelty of the paper is as follows. To date, the most popular authentication method is the password; however, the password is gradually being displaced by biometric means of authentication. For example, many smartphones are now equipped with fingerprint scanning. Nevertheless, keyboard-based authentication has its advantages: a fingerprint scanner risks failing to recognize an injured finger, methods of attacking fingerprint authentication already exist, and, most importantly, the proposed keystroke-dynamics system monitors input continuously, which makes it possible both to stop an attacker at the authentication phase and to detect him if, for example, he has gained access to the system by fraudulent means.
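Continuous "friend-or-foe" checking of this kind can be sketched as a running comparison of observed digraph latencies against a stored profile (a toy heuristic with an invented threshold and profile, not the author's method):

```python
from statistics import mean

# Enrolled profile: typical latency (ms) between specific key pairs
PROFILE = {("t", "h"): 80, ("h", "e"): 95, ("e", " "): 120}
THRESHOLD = 25  # invented tolerance in ms

def anomaly_score(observed):
    """Mean absolute deviation from the profile over known digraphs."""
    devs = [abs(lat - PROFILE[d]) for d, lat in observed if d in PROFILE]
    return mean(devs) if devs else 0.0

def still_owner(observed):
    """Continuous check: small deviations keep the session alive."""
    return anomaly_score(observed) <= THRESHOLD

genuine  = [(("t", "h"), 85), (("h", "e"), 90), (("e", " "), 130)]
intruder = [(("t", "h"), 140), (("h", "e"), 40), (("e", " "), 210)]
print(still_owner(genuine), still_owner(intruder))  # → True False
```

A real free-text system would update the profile online and score a sliding window of recent digraphs rather than a fixed batch.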
Prokhozhev N.N., Sivachev A.V., Mikhailichenko O.V., Bashmakov D.A.  Improving the precision of steganalysis in the DWT domain by using the interrelation between the domains of one-dimensional and two-dimensional decompositions


pp. 78-87

DOI: 10.7256/2306-4196.2017.2.22412
Abstract: The article presents research aimed at improving the precision of steganalysis in the DWT domain of digital images. The authors analyze the causes of inaccuracy in modern steganalysis methods based on support vector machines and propose directions for improving training quality. To improve the quality of support vector machine training, the authors study the interrelation between the domains of the one-dimensional and two-dimensional DWT and the influence of changes in the coefficients of the high-frequency subbands of the two-dimensional DWT on the coefficient subbands of the one-dimensional DWT. The steganographic influence consists in changing the least significant bits of the DWT coefficients. Based on the study results, the authors develop an original method that provides greater precision in detecting embedded information in the high-frequency subbands of the two-dimensional DWT of an image. To demonstrate this precision, the original method is compared with several modern steganalysis methods. Experimental results of the comparative study show that the original method provides greater precision (generally 10-15% higher than the other evaluated methods) in detecting steganographic influence in the high-frequency HL and LH subbands of the two-dimensional DWT, while matching the precision of the other evaluated methods in the high-frequency HH subband.
Komarova A.V., Korobeynikov A.G., Menshchikov A.A., Klyaus T.K., Negol's A.V., Sergeeva A.A.  Theoretical possibilities for combining various mathematical primitives within an electronic digital signature scheme


pp. 80-92

DOI: 10.25136/2306-4196.2017.3.23364
Abstract: The study is devoted to electronic digital signature algorithms and protocols, which provide the key properties of information: integrity, authenticity and availability. The article highlights the problems of modern cryptography and a possible way to solve them by creating an electronic digital signature that can withstand a quantum computer. It considers various mathematical primitives which, when used together, can increase the stability of existing cryptosystems. This area of research is new and promising for the development of domestic cryptography. The theoretical research methods include computational complexity theory; the theory of rings, fields and lattices; and algorithmic aspects of lattice theory and their application in cryptography, in particular the complexity of solving systems of linear Diophantine equations, of finding the shortest nonzero lattice vector, and of finding the lattice vector closest to a given vector, together with known approximate algorithms for these problems. The experimental methods include statistical calculations and data analysis in Matlab, construction of elliptic curves in Mathcad, and software implementation of the signature generation algorithm in Python using precompiled modules from the NumPy library. The following results are planned: 1) a methodology for constructing electronic digital signature schemes based on two independent computationally hard problems; 2) a polynomially complex electronic digital signature scheme based on fundamentally different mathematical primitives; 3) an estimate of the size of safe parameters for the developed digital signature protocols; 4) a theoretical model of the growth of computation time with the length of the digital signature key.
Bashmakov D.A.  Adaptive Prediction of Pixels in Gradient Areas to Raise Steganalysis Accuracy of Static Digital Images


pp. 83-93

DOI: 10.25136/2306-4196.2018.2.25514
Abstract: The author analyzes the accuracy of background area selection in static digital images by the histogram method as part of steganalysis performed with the Weighted Stego and WSPAM methods. He examines how the practical accuracy of steganalysis of static digital images with these methods depends on the pixel prediction model used in gradient regions of an image, in the context of countering data transmission channels that embed into the least significant bits of the spatial domain of static digital images with a significant share of homogeneous background. The Weighted Stego steganalysis algorithm and its WSPAM modification are analyzed. The BOWS2 collection was used to evaluate analysis efficiency, and images from a wide range of sources were used to evaluate the accuracy of homogeneous background selection. Information is embedded by changing the least significant bits of images in the spatial domain with a payload of 3-5%. The effectiveness of the methods is determined from the true positive, true negative, false positive and false negative image classification rates. The author demonstrates the low accuracy of homogeneous background selection by the histogram method, proposes selecting the homogeneous background with a segmentation neural network instead, and proves its efficiency. He also offers an improved pixel prediction model for image gradient areas that achieves the highest steganalysis accuracy. The results can be used to create systems of passive resistance to steganographic data transmission channels based on the Weighted Stego algorithm.
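The histogram method whose accuracy the author criticizes can be sketched as a simple baseline (a common formulation assumed here; the article's exact variant and its neural-network replacement are not reproduced): a block is declared homogeneous background when a single intensity bin dominates its histogram.

```python
import numpy as np

def is_homogeneous(block, dominance=0.9):
    """Histogram test: one intensity covers >= 90% of the block."""
    counts = np.bincount(block.ravel(), minlength=256)
    return counts.max() / block.size >= dominance

flat = np.full((8, 8), 200)                 # uniform background block
noisy = np.arange(64).reshape(8, 8) % 256   # textured block

print(is_homogeneous(flat), is_homogeneous(noisy))  # → True False
```

The weakness is visible already in this sketch: a smooth gradient block has a spread-out histogram and is misclassified as texture, which motivates the segmentation-based replacement.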
Mironov S.V.  A game-theoretic approach to testing compilers for the presence of mechanisms introducing undeclared capabilities


pp. 119-127

DOI: 10.7256/2306-4196.2017.1.20351
Abstract: The subject of the research is the mathematical support of software certification procedures for information security requirements under time, regulatory and design constraints. An essential requirement of such procedures is the availability of the source code of the software under test, which is quite critical for developers as a potential channel of intellectual property leakage. To overcome this drawback, a technique is proposed for testing compilers for the absence of mechanisms that introduce undeclared capabilities at the compilation stage. The research methodology combines methods of software engineering, probability theory, object-oriented programming, systems analysis and reliability theory. The main conclusion of the study is that by forming an optimal set of tests using the mathematical apparatus of game theory, compiling them, and comparing the control flow and data flow graphs obtained from the compiler output with those built from the original texts of the tests, one can conclude whether the compiler under test contains mechanisms for introducing undeclared capabilities into the compiled software.
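The game-theoretic core can be illustrated as a toy zero-sum game between the tester (choosing a test) and a hostile compiler (choosing where to hide an implant); the payoff matrix below is entirely invented. A conservative tester picks the maximin pure strategy, the test whose worst-case detection probability is largest.

```python
# Rows: tester's candidate tests; columns: compiler's hiding strategies.
# Entry = probability (invented) that the test exposes the implant.
PAYOFF = [
    [0.9, 0.1, 0.3],   # test A
    [0.5, 0.6, 0.4],   # test B
    [0.2, 0.8, 0.7],   # test C
]

def maximin(matrix):
    """Pick the row whose worst-case payoff is largest."""
    worst = [min(row) for row in matrix]
    best = max(range(len(matrix)), key=lambda i: worst[i])
    return best, worst[best]

row, value = maximin(PAYOFF)
print(row, value)  # → 1 0.4  (test B guarantees detection prob >= 0.4)
```

An optimal mixed strategy over tests can only do better than this pure-strategy bound, which is the direction the article's optimal test-set construction takes.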
Menshchikov A.A., Gatchin Y.  Methods for detecting automated data collection from web resources


pp. 136-157

DOI: 10.7256/2306-4196.2015.5.16589
Abstract: The article deals with the problem of automated data collection from web resources. The authors present a classification of detection methods that takes modern approaches into account, analyze existing methods for detecting and countering web robots, and study the possibilities and limitations of combining methods. To date, there is no open web robot detection system suitable for use in real conditions, so the development of an integrated system combining a variety of methods, techniques and approaches is an urgent task. To address it, the authors developed a software prototype of such a detection system and tested it on real data. The theoretical significance of the study lies in advancing this line of work in the domestic segment: building a web robot detection system based on the latest methods and on improvements of global best practices. Its applied significance lies in creating a basis for the development of sought-after and promising software.
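A single detection method of the kind being combined can be sketched as a request-rate heuristic over access-log entries (log data and threshold invented; the systems the article surveys fuse many such features with behavioral and syntactic ones):

```python
from collections import defaultdict

# (client_ip, timestamp_seconds) — invented access-log sample:
# one client requesting every 2 s, one browsing at human pace
LOG = ([("10.0.0.1", t) for t in range(0, 60, 2)]
       + [("10.0.0.2", t) for t in (3, 25, 48)])

def flag_robots(log, window=60, max_requests=10):
    """Flag clients exceeding a request budget within the window."""
    counts = defaultdict(int)
    for ip, ts in log:
        if ts < window:
            counts[ip] += 1
    return {ip for ip, n in counts.items() if n > max_requests}

print(flag_robots(LOG))  # → {'10.0.0.1'}  (30 requests in 60 s)
```

Rate thresholds alone are easily evaded by throttled crawlers, which is precisely why the article argues for an integrated multi-method system.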
Mironov S.V., Kulikov G.V.  Technologies of security control for automated systems on the basis of structural and behavioral software testing


pp. 158-172

DOI: 10.7256/2306-4196.2015.5.16934
Abstract: The subject of the study is the basic methods and principles of testing software systems used for the safety evaluation and control of automated systems. The study provides recommendations on methods of testing software against the most common threats to security subsystems such as the firewall, audit, access control, integrity monitoring, password protection and encryption. The authors consider the possibility that a product could contain the following vulnerabilities: buffer overflows, incorrect handling of format strings, and race conditions. The research methods include the theory of programming, reliability theory, software engineering, error-correcting coding, information security and systems analysis. The main conclusion of the study is that software testing is a powerful tool for detecting both errors in software and security vulnerabilities. Modern behavioral testing methods can identify vulnerabilities without the software source code and can be successfully used on the Russian market, where obtaining source code for testing purposes is almost impossible.
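Behavioral (black-box) testing of the kind recommended can be sketched as boundary-value probing of a function through its interface alone (a toy target with a deliberately planted off-by-one bug; all names invented):

```python
def check_password_length(pw: str) -> bool:
    """Target under test: must accept 8..64 characters inclusive.
    Planted bug: rejects the upper boundary value."""
    return 8 <= len(pw) < 64   # bug: should be <= 64

def boundary_test():
    """Probe each boundary and its neighbors through the interface,
    returning the lengths where behavior deviates from the spec."""
    cases = {7: False, 8: True, 9: True, 63: True, 64: True, 65: False}
    return [n for n, expect in cases.items()
            if check_password_length("a" * n) != expect]

print(boundary_test())  # → [64]  — the off-by-one surfaces at the boundary
```

No source code was consulted: the defect is exposed purely by choosing inputs at specification boundaries, which is the essence of the behavioral techniques the article advocates.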
