Electronic journal Cybernetics and programming - No. 6, 2018 - Contents, list of articles - ISSN: 2644-5522 - NotaBene Publishing
Cybernetics and programming

Contents of Issue No. 06/2018
Question at hand
Chickrin D.E., Egorchev A.A., Briskiy D.V., Zakirov R.I. - Methods for obtaining and processing data from a bundle of downhole modules obtained by vertical seismic profiling in the software for controlling the complex for receiving seismic signals in a well pp. 1-10

DOI:
10.25136/2644-5522.2018.6.28091

Abstract: The object of research in this article is a system for receiving seismic signals in a well by the method of vertical seismic profiling. The subject of the research is methods for processing data from a bundle of downhole modules, obtained by vertical seismic profiling, in the software for monitoring and controlling a hardware-methodical complex for receiving seismic signals in a well. The authors consider in detail such aspects of the topic as the complexity and speed of the algorithms for processing seismic data obtained from the borehole and surface modules. The authors use the following scientific methods: cross-correlation in the time and frequency domains. The novelty of the results lies in the conclusion that, in the considered complex, correlation in the spectral domain provides a gain in the number of operations over correlation in the time domain. Calculation in the time domain gives a more accurate result, since the Fourier transform of a finite sample introduces distortions even when window functions are used. To obtain a correlogram of the same length with the time-domain method, it is necessary to register and process a larger amount of data than with the spectral-domain method.
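
As a rough illustration of the operation-count argument above (a sketch under assumed signal lengths, not the authors' implementation), the correlogram can be computed either directly in the time domain or through the FFT in the spectral domain:

    import numpy as np

    def correlate_time(pilot, trace):
        # Direct time-domain cross-correlation: roughly len(pilot) * len(trace)
        # multiply-accumulate operations.
        return np.correlate(trace, pilot, mode="full")

    def correlate_fft(pilot, trace):
        # Spectral-domain equivalent: correlation as convolution with the reversed
        # pilot, computed through zero-padded FFTs in O(N log N) operations.
        n = len(pilot) + len(trace) - 1
        return np.fft.irfft(np.fft.rfft(trace, n) * np.fft.rfft(pilot[::-1], n), n)

    rng = np.random.default_rng(0)
    pilot = rng.standard_normal(1024)   # assumed sweep length
    trace = rng.standard_normal(8192)   # assumed recorded trace length
    print(np.allclose(correlate_time(pilot, trace), correlate_fft(pilot, trace)))  # True
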
Data encryption and data protection
Revnivykh A.V., Velizhanin A.S. - The method of automated research of the structure of disassembled representation of software code with a buffer overflow vulnerability using the matrix approach pp. 11-30

DOI:
10.25136/2644-5522.2018.6.28288

Abstract: The subject of the research is optimization algorithms for automated dependency search over disassembled code. The object of the research is dependent code blocks on the x64 architecture of Intel processors and the listings obtained by reverse engineering software built by compilers with different settings under Windows and Linux. Purpose of the study. The purpose of the study is to consider the possibility of using mathematical matrices to build a machine code map, to review possible obstacles to automatic analysis, and to search for the paths of information flows. Research methods. The Visual C++ compiler was used. We consider an architecture in which the transfer of information can be carried out in the following ways: register-memory, memory-register, register-register. For the analysis, the chosen method forms, for each considered path to the potentially dangerous code block under investigation, the list of functions called on the way to that block. Methods for implementing the matrix approach are described and developed. Novelty and key findings. The mathematical matrix method can be used to build a machine code map. However, determining the reachability paths of individual code blocks may require a significant amount of resources. In addition, the machine code can be processed by packers and obfuscators, which introduces additional complexity. A number of potentially dangerous functions of the standard library of the C/C++ programming language were identified.
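
For intuition only (a hypothetical example, not the tool described in the article), a machine code map over disassembled blocks can be stored as a boolean adjacency matrix whose transitive closure answers reachability queries for a potentially dangerous block:

    from itertools import product

    # Invented block labels and control-flow edges; "copy_buf" stands for a block
    # containing an unchecked buffer copy (e.g., a strcpy call).
    blocks = ["entry", "parse_input", "copy_buf", "log", "exit"]
    idx = {b: i for i, b in enumerate(blocks)}
    n = len(blocks)

    adj = [[False] * n for _ in range(n)]   # adj[i][j]: block i can reach block j directly
    for src, dst in [("entry", "parse_input"), ("parse_input", "copy_buf"),
                     ("parse_input", "log"), ("copy_buf", "exit"), ("log", "exit")]:
        adj[idx[src]][idx[dst]] = True

    # Warshall's algorithm: O(n^3) boolean transitive closure of the code map.
    reach = [row[:] for row in adj]
    for k, i, j in product(range(n), repeat=3):
        reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])

    dangerous = idx["copy_buf"]
    print([blocks[i] for i in range(n) if reach[i][dangerous]])  # ['entry', 'parse_input']
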
Question at hand
Tymchuk A.I. - Textural signs in the problem of segmentation of aerial photographs based on luminance dependence matrices pp. 31-39

DOI:
10.25136/2644-5522.2018.6.28395

Abstract: Computer image analysis is automatic image processing in the course of which the objects present in the image are detected and classified. One of the most important stages of this analysis is image segmentation, by means of which, based on a set of characteristics (color, texture, brightness, etc.), the initial image is divided into many non-intersecting regions. The importance of this stage lies in its significant impact on the final result of the analysis. The object of the research is the method of texture segmentation of an image based on the construction and use of luminance dependence matrices. The subject of the research is the effect of textural features on the quality of image segmentation. Special attention is paid to the calculation of the textural features and of the segmentation evaluation criteria. The research methodology is based on the analysis of texture segmentation of images using empirical evaluation criteria and reference segmentations. The main conclusion of the study concerns the choice of the set of textural features that showed the best segmentation results; it was made on the basis of the analysis of the values of the selected criteria for assessing segmentation quality. The texture segmentation of images and the evaluation of the criteria were performed with a program developed in the C++ programming language. The novelty of the study is the analysis of textural characteristics with respect to the quality of the image segmentation produced on their basis.
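
Assuming the luminance dependence matrix referred to above corresponds to the familiar gray-level co-occurrence matrix, the following minimal sketch (an illustration, not the article's C++ program) shows how such a matrix and a few texture features derived from it can be computed for one pixel offset:

    import numpy as np

    def cooccurrence(image, dx=1, dy=0, levels=8):
        # Count pairs of gray levels separated by the offset (dx, dy), then
        # normalize the counts to joint probabilities.
        m = np.zeros((levels, levels))
        h, w = image.shape
        for y in range(h - dy):
            for x in range(w - dx):
                m[image[y, x], image[y + dy, x + dx]] += 1
        return m / m.sum()

    def texture_features(p):
        i, j = np.indices(p.shape)
        return {"energy": float((p ** 2).sum()),
                "contrast": float((p * (i - j) ** 2).sum()),
                "homogeneity": float((p / (1.0 + np.abs(i - j))).sum())}

    rng = np.random.default_rng(1)
    img = rng.integers(0, 8, size=(32, 32))         # quantized 8-level test image
    print(texture_features(cooccurrence(img, dx=1, dy=0)))
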
Polyanichko M.A. - Using technical indicators to identify insider threats pp. 40-47

DOI:
10.25136/2644-5522.2018.6.27970

Abstract: Detecting insider threats and countering them is a complex task faced by information security experts in both the commercial sector and government organizations. Modern organizations depend on information technology and their information assets, which makes the problem of confronting insiders all the more urgent. Insiders can be identified by introducing a complex of both technical and organizational measures. The article proposes using data from the logs of information protection software and other monitoring tools to identify insider threats, and highlights a set of technical indicators that point to suspicious employee actions. The set of technical indicators proposed in the article can be used to build a system of logical rules or fuzzy inference rules that allow identifying insiders in an organization. The introduction of mechanisms for analyzing the proposed indicators will improve the efficiency of the information security administrator and will help prevent incidents related to the realization of insider threats.
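
As a toy illustration of how such indicators might be combined into logical rules (the indicator names and thresholds below are hypothetical and are not the set proposed in the article):

    from dataclasses import dataclass

    @dataclass
    class Indicators:
        after_hours_logins: int      # logins outside working hours over the period
        mass_file_copies: bool       # bulk copying to removable media detected
        dlp_violations: int          # alerts raised by the data-loss-prevention system
        agent_tampering: bool        # endpoint protection disabled or reconfigured

    def insider_suspected(ind: Indicators) -> bool:
        # Each satisfied condition contributes one vote; two or more raise an alert.
        votes = sum([ind.after_hours_logins >= 5,
                     ind.mass_file_copies,
                     ind.dlp_violations >= 3,
                     ind.agent_tampering])
        return votes >= 2

    print(insider_suspected(Indicators(7, True, 0, False)))   # True
    print(insider_suspected(Indicators(1, False, 1, False)))  # False
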
Parallel algorithms for numerical analysis
Pekunov V.V. - Application of prediction in parallel processing of chains of predicates in regular-logic expressions pp. 48-55

DOI:
10.25136/2644-5522.2018.6.27986

Abstract: This paper addresses the problem of choosing the execution mode (sequential or parallel) when processing chains of predicates in regular-logic expressions. A brief description is given of the essence of regular-logic expressions, their known applications (natural language interfaces, an automatic parallelizer of C programs), and the types and composition of predicate chains. Particular attention is paid to predicting the time spent on processing chains in one mode or the other. Various approaches to such prediction are considered in detail. It is noted that in this case a semi-empirical statistical approach is the most natural. The paper uses the basic relations of the theory of parallel computing, interpolation and extrapolation methods, computational experiment, and elements of statistical processing. A new semi-empirical statistical approach to estimating the execution time of chains of predicates is proposed. The approach is distinguished by a minimal amount of time measurements, achieved through partial recovery of missing data, and by the use of potentially more accurate linear autoregressive and quadratic models to estimate the execution time in the sequential and parallel modes.
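
A minimal sketch of the mode-selection idea (with invented measurements and plain polynomial fits rather than the models calibrated in the paper):

    import numpy as np

    # (chain length, measured time in ms) samples for each mode; values are invented.
    seq_samples = [(10, 0.9), (20, 2.1), (40, 4.2), (80, 8.5)]
    par_samples = [(10, 1.4), (20, 1.9), (40, 2.8), (80, 4.6)]

    def fit_quadratic(samples):
        x = np.array([s[0] for s in samples], dtype=float)
        t = np.array([s[1] for s in samples], dtype=float)
        return np.polyfit(x, t, deg=2)        # least-squares quadratic time model

    seq_model, par_model = fit_quadratic(seq_samples), fit_quadratic(par_samples)

    def choose_mode(chain_length):
        t_seq = np.polyval(seq_model, chain_length)
        t_par = np.polyval(par_model, chain_length)
        return "parallel" if t_par < t_seq else "sequential"

    print(choose_mode(12))    # short chain: parallel overhead does not pay off
    print(choose_mode(100))   # long chain: parallel mode predicted to be faster
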
Mathematical models and computer simulation experiment
Sklyar A. - Time series analysis and identification of processes with diffuse periodicity pp. 56-64

DOI:
10.25136/2644-5522.2018.6.27069

Abstract: The subject of the research is a method for estimating and removing the noise component of a time series and for extracting the trend and fluctuations with different periods; the concepts of T-ε and T-h-ε almost-periods for a finite series are introduced. The analysis is based on the requirement of smoothness of a function representing the original data and having derivatives up to the fourth order inclusive, and on the identification of almost-periods using functions of the Alter-Johnson type. The trend of the lengths of the almost-periods identified in the fluctuation component of the data is estimated separately. The algorithm for solving the problem is based on minimizing the deviations of the calculated values from a smooth function, provided that the deviations from the source data correspond to the noise level. To identify the oscillatory component and the trend of the almost-periods, a modified Alter-Johnson function is used. The proposed methodology and algorithms for estimating and eliminating noise in the data make it possible to determine the noise level in the data in a well-founded way, remove the noise component from the data, identify the almost-periods in the data in the sense of the definitions introduced in the article, extract the trend and oscillation components of the data and, if necessary, identify the trend of changes in the almost-periods.
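
For orientation, one commonly used form of the (unmodified) Alter-Johnson function and the way its minima point to almost-periods; the signal below is synthetic and the author's modification is not reproduced:

    import numpy as np

    def alter_johnson(x, max_lag):
        # a(tau) = mean over t of |x(t + tau) - x(t)|; local minima of a(tau)
        # indicate candidate (almost-)periods of the series.
        x = np.asarray(x, dtype=float)
        return np.array([np.abs(x[tau:] - x[:-tau]).mean() for tau in range(1, max_lag + 1)])

    rng = np.random.default_rng(2)
    t = np.arange(600)
    x = np.sin(2 * np.pi * t / 50) + 0.2 * rng.standard_normal(t.size)  # true period 50
    a = alter_johnson(x, max_lag=200)
    minima = [i + 1 for i in range(1, len(a) - 1) if a[i] < a[i - 1] and a[i] < a[i + 1]]
    print(minima)   # expected near 50, 100, 150 (up to noise)
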
Goryachev A.V., Novakova N.E. - Network traffic modeling based on the token bucket algorithm pp. 65-79

DOI:
10.25136/2644-5522.2018.6.27778

Abstract: The object of research in this article is a system for simulating network traffic and optimizing it. The subject of research is the token bucket algorithm and methods for optimizing network traffic. Particular attention is paid to the network parameters subject to control. We consider the problem of traffic management aimed at ensuring the quality of network service. Dynamic filter models are proposed, based on a token bucket algorithm and a multiplexer, that support network quality control. The problem of choosing the optimal strategy for controlling the parameters of traffic filters operating by the token bucket algorithm is considered. The main research methodology is simulation modeling. Metaheuristic optimization algorithms such as the genetic algorithm, the harmony search algorithm, and the lifting algorithm are investigated. As a result of the research, a mathematical model for assessing the effectiveness of a network segment was developed. A simulation and analytical model of network traffic based on the token bucket algorithm has been developed and implemented. The capabilities of several optimization algorithms are analyzed. Simulation experiments were conducted, which resulted in the identification of optimal solutions. The study presented can be used to solve problems of improving the quality of network services.
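
For reference, a minimal textbook token bucket (the rate and bucket depth are assumed; this is not the authors' simulation model):

    import time

    class TokenBucket:
        def __init__(self, rate_bps, burst_bytes):
            self.rate = rate_bps          # token refill rate, bytes per second
            self.capacity = burst_bytes   # bucket depth: maximum burst size
            self.tokens = burst_bytes
            self.last = time.monotonic()

        def allow(self, packet_bytes):
            now = time.monotonic()
            # Refill tokens in proportion to elapsed time, capped at the bucket depth.
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if packet_bytes <= self.tokens:
                self.tokens -= packet_bytes   # conforming packet: consume tokens
                return True
            return False                      # non-conforming packet: drop or mark it

    bucket = TokenBucket(rate_bps=125_000, burst_bytes=10_000)   # ~1 Mbit/s, 10 KB burst
    print([bucket.allow(1500) for _ in range(8)])   # the burst allowance runs out mid-list
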
Telecommunication systems and computer networks
Gibadullin R.F. - Organization of secure data transmission in a sensor network based on AVR microcontrollers pp. 80-86

DOI:
10.25136/2644-5522.2018.6.24048

Abstract: The subject of the research is the implementation of the AES encryption algorithm on AVR microcontrollers to provide secure data transmission in a sensor network. A sensor network is a networking technique for implementing a ubiquitous computing environment: a wireless environment consisting of many lightweight, low-power sensor nodes. Although a sensor network provides various capabilities, it cannot by itself ensure secure authentication between nodes, which ultimately undermines the reliability of the entire network and gives rise to numerous security problems. Therefore, an encryption algorithm suitable for sensor networks is required to build a reliable sensor network environment. In this paper, the author proposes a solution for a reliable sensor network and analyzes communication efficiency by measuring the performance of the AES encryption algorithm as a function of plaintext size and the cost of operation per hop as a function of network scale.
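
A desktop-side sketch of the plaintext-size measurement only (the article targets AVR microcontrollers, where a C implementation would be used; the pycryptodome dependency and the sizes below are assumptions):

    import os
    import time
    from Crypto.Cipher import AES   # third-party package: pip install pycryptodome

    key = os.urandom(16)                    # 128-bit key, assumed shared between nodes
    cipher = AES.new(key, AES.MODE_ECB)     # ECB here only to time the raw block cipher

    for size in (16, 64, 256, 1024, 4096):  # plaintext sizes in bytes (16-byte blocks)
        plaintext = os.urandom(size)
        start = time.perf_counter()
        for _ in range(1000):
            cipher.encrypt(plaintext)
        elapsed = time.perf_counter() - start
        print(f"{size:5d} bytes: {elapsed / 1000 * 1e6:8.1f} us per encryption")
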
Databases
Bodrina N., Sidorov K., Filatova N., Shemaev P. - Software complex for the formation of situationally conditioned patterns of physical signals pp. 87-97

DOI:
10.25136/2644-5522.2018.6.28151

Abstract: The subject of the research is the task of creating tools for forming information resources with samples of physical signals recorded from a person experiencing an emotional reaction caused by a certain informational stimulus. The results are presented of an analysis of the best-known national databases with examples of emotional reactions in patterns of English and French speech, photographs of people, and samples of cardiograms, galvanic skin reactions, heart rate and other physical signals. The structure is considered of a new hardware-software complex for creating and maintaining an open information resource that integrates examples of recordings of Russian speech with recordings of other physical signals registered from a person during emotional reactions of different signs. Field experiments with the hardware and software were conducted. Methods of spectral analysis and nonlinear dynamics were used to form vector descriptions of the patterns of physical signals. The database was developed using systems analysis methods. The new results include the structure of the software and information support; the features of the methodological support, which allow registering objectively confirmed changes in the emotional state of a person; the features of the technical support, which supports registration of biomedical signals through five channels (video, audio, electroencephalogram, electrocardiogram, electromyogram); and the structure and features of an open online version of the multimodal emotion base. The creation and periodic updating of the content of the database of patterns of situational reactions makes available to all interested users complete information on each experiment, including recordings of speech and physical signals, as well as data on the methodology of the experiments and the observation protocols.
Knowledge bases, intelligent systems, expert systems, decision support systems
Fedorova N.I., Klimenteva A.Y. - Information support for decision-making in the formation of a strategy for innovative development of the region pp. 98-109

DOI:
10.25136/2644-5522.2018.6.27399

Abstract: On the basis of a new methodology for assessing the current state of innovative development of a region, a decision support system (DSS) has been developed for use when devising an innovative development strategy for a region. The generalized structure of the decision support system and the description and purpose of its main modules are given. The work of the DSS was tested on the example of the Republic of Bashkortostan. The resulting recommendations are needed by the state authorities of a constituent entity of the Russian Federation to form an effective plan of measures that takes into account the current state and the existing opportunities for innovative development of the region. The study is based on general scientific methods of cognition (analysis, synthesis, comparison) and on the tabular and graphical presentation of empirical and factual information. The theoretical and practical significance of the study stems from the relevance of the studied problems of assessment, forecasting and planning of innovative development of territories when developing regional development strategies. The practical result of the research is the approbation and implementation of the proposed approaches in the development of an information decision support system needed to obtain recommendations on the formation of an effective action plan for a regional innovative development strategy that takes into account the current state and available capabilities of the territory.
Katasev A.S. - Neuro-fuzzy model of classification rules generation as an effective approximator of objects with discrete output pp. 110-122

DOI:
10.25136/2644-5522.2018.6.28081

Abstract: The subject of this research is evaluating the effectiveness of approximating objects with discrete output on the basis of fuzzy knowledge bases. The object of the research is a neuro-fuzzy model which, based on the training of a fuzzy neural network, allows forming a system of fuzzy production rules (a fuzzy knowledge base) for assessing the state of objects. The author examines in detail the proposed form of fuzzy production rules and the algorithm of logical inference over the rules, and describes the developed fuzzy neural network model. Particular attention is paid to the need to assess the approximating ability of the model in order to determine the feasibility and effectiveness of its practical use. This assessment was made by analyzing the following characteristics of the model:
- convergence of the developed learning algorithm for the fuzzy neural network;
- compliance of its operation with the principles of fuzzy approximation;
- consistency of the model's logical inference algorithm with the well-known algorithm for approximating objects with discrete output on the basis of a fuzzy knowledge base.
The approximating ability of the neuro-fuzzy model was estimated, and on the basis of the results it was concluded that this model is an effective approximator of objects with discrete output. In addition, in order to test the model, the classifying ability of the generated fuzzy rules was assessed. The accuracy of classification based on the fuzzy rules turned out to be no lower than the accuracy of other known classification methods. The practical value of such rules is the ability to build decision support systems for assessing the state of objects in various subject areas.
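
A generic illustration of inference over fuzzy production rules of this kind (the terms, rules and weights are invented; the model's specific rule format is not reproduced):

    def tri(x, a, b, c):
        # Triangular membership function with support [a, c] and peak at b.
        if x < a or x > c:
            return 0.0
        if x == b:
            return 1.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    terms = {"low":  lambda x: tri(x, 0.0, 0.0, 0.5),
             "mid":  lambda x: tri(x, 0.2, 0.5, 0.8),
             "high": lambda x: tri(x, 0.5, 1.0, 1.0)}

    # Each rule: (term of feature 1, term of feature 2, class label, rule weight).
    rules = [("low",  "low",  "normal", 1.0),
             ("high", "mid",  "fault",  0.9),
             ("high", "high", "fault",  1.0)]

    def classify(x1, x2):
        # min as the fuzzy AND; the class of the most strongly fired rule wins.
        scores = {}
        for t1, t2, label, weight in rules:
            firing = min(terms[t1](x1), terms[t2](x2)) * weight
            scores[label] = max(scores.get(label, 0.0), firing)
        return max(scores, key=scores.get)

    print(classify(0.1, 0.2))   # -> normal
    print(classify(0.9, 0.6))   # -> fault
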