Cybernetics and programming
Contents of Issue № 05/2018
Question at hand
Chernyshev Y.O., Ventsov N.N., Pshenichnyi I.S. - A possible method of allocating resources under destructive conditions pp. 1-7


Abstract: The subject of the research is an approach to resource allocation under possible destructive conditions. The object of the research is a model of distributional decision-making processes under possible destructive influences. The authors consider the modeling of resource-flow distribution processes under possible undesirable effects. It is shown that using relative fuzzy estimates of resource-transfer routes is more expedient, in terms of the time complexity of the decision-making process, than modeling the entire resource-allocation area: based on statistical and expert assessments, route preferences can be determined quickly from the standpoint of guaranteed resource transfer under destructive impacts. The research method rests on set theory, fuzzy logic, and evolutionary and immune approaches. Fuzzy preference relations reduce the time needed to build a model, while evolutionary and immune methods speed up the search for a solution. The main conclusion of the study is that relative fuzzy estimates of route preferences can be used when organizing resource allocation. An algorithm for resource allocation under destructive influences is proposed; its distinctive feature is the use of information about previously implemented allocations when forming the set of initial solutions. The solutions obtained are to be verified using negative selection, one of the methods for modeling the immune system, while existing solutions can be modified using, for example, the methods of evolutionary modeling.
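The negative selection mentioned in the abstract can be illustrated with a minimal sketch: random detectors are generated and kept only if they match no element of the "self" set (here, hypothetical bit-encoded allocations that were previously implemented successfully); a candidate solution flagged by any surviving detector is rejected. The r-contiguous-bits matching rule, the encodings, and all names below are illustrative assumptions, not the authors' implementation.

```python
import random

def matches(detector, pattern, threshold=3):
    """A detector 'matches' a pattern if they agree in at least
    `threshold` contiguous positions (r-contiguous-bits rule)."""
    run = best = 0
    for d, p in zip(detector, pattern):
        run = run + 1 if d == p else 0
        best = max(best, run)
    return best >= threshold

def train_detectors(self_set, n_detectors, length, seed=0):
    """Negative selection: keep only random detectors that match
    no element of the 'self' set (previously valid allocations)."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.randint(0, 1) for _ in range(length))
        if not any(matches(cand, s) for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(pattern, detectors):
    """A candidate allocation flagged by any detector is rejected."""
    return any(matches(d, pattern) for d in detectors)

# Hypothetical 4-bit route-preference encodings of known-good allocations
self_set = [(0, 0, 1, 1), (0, 1, 0, 1)]
detectors = train_detectors(self_set, n_detectors=5, length=4)
```

By construction, no detector matches a known-good allocation, so the previously implemented solutions always pass verification.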
Parallel algorithms for numerical analysis
Sechenov P., Olennikov A.A. - Application of NVIDIA CUDA parallel programming technology in the task of melting a spherical particle pp. 8-14


Abstract: The article describes the use of NVIDIA CUDA parallel programming technology for the problem of melting a spherical particle. Modern computers tend to gain power by increasing the number of cores rather than the processor clock frequency, since raising the frequency leads to significant energy consumption and heat generation. Amdahl's law is presented, which estimates the speedup of a program parallelized across N processors. The conditions for improving algorithm performance when parallelizing tasks are listed. The problem of melting iron ore particles is stated. The features of the CUDA C parallel programming language are considered, and algorithms for the chosen task are presented. A comparative analysis of execution time on the CPU (C#) and GPU (CUDA C) has been made. CUDA parallel programming technology increases the performance of parallelized algorithms of complexity N by up to 60 times. This requires a graphics processor supporting the technology, the CUDA development environment and compiler, knowledge of the CUDA C language, and a good understanding of the task and of how it can be parallelized.
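Amdahl's law, referenced in the abstract, can be written as S = 1 / ((1 - p) + p / N), where p is the parallelizable fraction of the program and N the number of processors. A small sketch (illustrative, not from the article):

```python
def amdahl_speedup(parallel_fraction, n_processors):
    """Amdahl's law: S = 1 / ((1 - p) + p / N), where p is the
    fraction of the runtime that can be parallelized."""
    return 1.0 / ((1.0 - parallel_fraction)
                  + parallel_fraction / n_processors)

# Even with 1024 processors, a 95%-parallel program gains less
# than the serial-fraction bound 1 / (1 - 0.95) = 20x.
speedup = amdahl_speedup(0.95, 1024)
```

The serial fraction thus caps the achievable speedup no matter how many GPU cores are available, which is why the abstract stresses the parallelizability of the task itself.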
Methods, languages and forms of human-computer interaction
Kukushkin Y.A., Aivazyan S.A. - Methods for the automated processing of operator control movements in applied research on the reliability of ergatic systems pp. 15-23


Abstract: The subject of the study is the optimization of ergatic system control laws based on the psychophysiological capabilities of the operator. The difficulty of analyzing the operator's control movements stems from the fact that the processes of manipulating the controls of an ergatic system are non-stationary, so digital filtering techniques are needed to remove the low-frequency trend from the analyzed process and pass to the analysis of a stationary random process. A technique for the automated processing of operator control movements in applied research on the reliability of ergatic systems is presented; it was tested during ergonomic research on a hardware-in-the-loop (HIL) simulator complex. The research methodology combines techniques from engineering psychology, reliability theory, ergonomics, spectral analysis, mathematical cybernetics, and computational mathematics. The main conclusion of the study is that the analysis of the operator's control movements must be an integral part of HIL ergonomic research, since its results allow adequate consideration of the psychophysiological state and reserve capacity of the ergatic system operator while managing its operation. This makes it possible to develop and implement a set of measures ensuring the proper functional reliability of the operator's professional activity and the safety of system operation.
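The detrending step described in the abstract (removing the low-frequency trend so the residual can be treated as approximately stationary) can be sketched with a simple centered moving average. This is a generic illustration of the idea, not the filtering technique used by the authors; the window size and signal are assumptions.

```python
def moving_average(x, window):
    """Centered moving average; edges use a shrinking window."""
    n = len(x)
    half = window // 2
    out = []
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def detrend(x, window=5):
    """Subtract the low-frequency trend so the residual can be
    analyzed as an (approximately) stationary random process."""
    trend = moving_average(x, window)
    return [xi - ti for xi, ti in zip(x, trend)]

# A linear drift plus an alternating 'control movement' component
signal = [0.1 * i + (1 if i % 2 else -1) for i in range(20)]
residual = detrend(signal)
```

Once the trend is removed, spectral analysis of the residual becomes meaningful, which is the precondition the abstract points to.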
Systems analysis, search, analysis and information filtering
Myasnikov V.I. - An algorithm for generating a high-precision PWM signal pp. 24-31


Abstract: The objects of the study are regulators built on pulse-width modulation (PWM). The popularity of PWM-based regulation is explained by the simplicity of its implementation and the absence of hysteresis in the executive electromagnetic device. The bit width and frequency of the PWM signal affect the quality of regulation. Modern microcontrollers have an integrated PWM module, so implementing a control device on them is simple and inexpensive. Given the required PWM parameters, frequency and bit depth, the developer often faces difficulty implementing the controller because of the limited resources of the microcontroller, in particular its clock frequency. The paper analyzes a given N-bit PWM by splitting it into two components, one of which is determined by the frequency and width of the processor's PWM module. The required memory size is investigated as a function of the splitting algorithm and of the bit depth of the microcontroller used. As a result of the research, the memory costs of a table-based method for generating a PWM signal are determined depending on the width of the microcontroller used. The main result of the work is the possibility of increasing the resolution of the controller with limited microcontroller resources. Recommendations on implementing the algorithm for the selected microcontroller are given, along with the required memory resources.
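One common way to split an N-bit duty value into two components, in the spirit the abstract describes, is to load the upper bits into the hardware PWM register and dither the lower bits across several periods so that the average duty equals the full-resolution value. The sketch below is a generic illustration under that assumption, not the paper's algorithm; the bit widths and the Bresenham-style spreading are my choices.

```python
def split_duty(duty, n_bits, hw_bits):
    """Split an N-bit duty value into a coarse hardware part and a
    fine part dithered over 2**(n_bits - hw_bits) PWM periods."""
    fine_bits = n_bits - hw_bits
    coarse = duty >> fine_bits             # goes to the hardware PWM register
    fine = duty & ((1 << fine_bits) - 1)   # distributed across periods
    return coarse, fine

def dither_table(coarse, fine, fine_bits):
    """Per-period duty values: over 2**fine_bits periods, 'fine' of
    them use coarse+1 and the rest use coarse, so the mean duty
    equals the full N-bit value."""
    periods = 1 << fine_bits
    table, acc = [], 0
    for _ in range(periods):
        acc += fine
        if acc >= periods:          # spread the +1 periods evenly
            acc -= periods
            table.append(coarse + 1)
        else:
            table.append(coarse)
    return table

# 12-bit resolution on a hypothetical 8-bit hardware PWM module
coarse, fine = split_duty(0x2C7, n_bits=12, hw_bits=8)
table = dither_table(coarse, fine, fine_bits=4)
```

The table of 16 per-period values is what a timer interrupt would feed into the PWM register, giving 12-bit effective resolution from an 8-bit peripheral.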
Automated systems of engineering process administration
Fedosovsky M.E. - Development of a methodology for constructing control systems for complex technical complexes using the methods of mathematical category theory pp. 32-43


Abstract: The objects of research in this work are control systems for complex technical complexes; the subject of research is a methodology for developing such systems. The developed methodology for creating a control system for complex technical complexes is based on the idea of generating a sequence of mappings from conceptual models into infological models and, further, into datalogical models. Previously, the author presented conceptual and infological modeling, the mathematical models corresponding to these levels, and the relations between them, that is, mathematical categories. The methodology rests on the methods of mathematical category theory; the categories presented in the datalogical representation have two levels of abstraction. The main findings of the study: 1. A unified description of families of inhomogeneous mathematical models reflecting different levels of abstraction (generalization) at the datalogical stage of presenting subject problems makes it possible to formulate a general definition of the models together with a description of their structure. 2. The developed method of datalogical modeling provides everything needed to customize the control system of a complex technical complex for specific software and hardware implementation tools.
Raikhlin V.A., Minyazev R.S., Klassen R.K. - The efficiency of a large conservative-type DBMS on a cluster platform pp. 44-62


Abstract: The article discusses the results of original research on the principles of organization and the operational features of conservative cluster-type DBMSs. The focus on large-scale databases is motivated by modern trends in the intelligent processing of large information arrays. Growing database volumes require hashing the data over cluster nodes, which in turn necessitates a regular query processing plan with dynamic segmentation of intermediate and temporary relations. A comparative evaluation against the alternative "core-to-query" approach, in which the database is replicated across cluster nodes, is provided. A significant part of the article is devoted to a theoretical analysis of GPU acceleration for conservative DBMSs with a regular query processing plan. Experimental studies were carried out on specially developed full-scale models, Clusterix, Clusterix-M, and PerformSys, with MySQL at the executive level; the theoretical analysis of GPU acceleration uses the proposed Clusterix-G project as an example. The following results are shown: the behavior of the Clusterix DBMS in dynamics and the optimal architectural variant of the system; a manifold increase in scalability and performance in the transition to multiclustering (the Clusterix-M DBMS) or to the advanced "core-to-query" technology (PerformSys); and the non-competitiveness of GPU acceleration, compared with the "core-to-query" approach, for medium-sized databases that do not exceed the size of the cluster's memory but do not fit into the GPU's global memory. For large-scale databases, a hybrid technology (the Clusterix-G project) is proposed, with the cluster divided into two parts: one performs selection and projection over a compressed database hashed across nodes, while the other performs joins in the "core-to-query" manner. The functions of the GPU accelerators differ between the two parts. Theoretical analysis showed this technology to be more effective than Clusterix-M, but the advisability of using graphics accelerators within this architecture requires further experimental research. It is noted that the Clusterix-M project remains viable in the Big Data field, as does the "core-to-query" approach given modern, albeit expensive, information technologies.
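The hashing of a database over cluster nodes that the abstract refers to can be illustrated with a minimal sketch: each row is routed to a node by a stable hash of its key. The key column, node count, and use of MD5 are illustrative assumptions, not details of the Clusterix systems.

```python
import hashlib

def node_for_key(key, n_nodes):
    """Hash-partition a row key over cluster nodes. MD5 keeps the
    placement stable across runs (unlike Python's salted hash())."""
    digest = hashlib.md5(str(key).encode()).hexdigest()
    return int(digest, 16) % n_nodes

def partition(rows, key_col, n_nodes):
    """Distribute the rows of a relation across n_nodes by key column."""
    nodes = [[] for _ in range(n_nodes)]
    for row in rows:
        nodes[node_for_key(row[key_col], n_nodes)].append(row)
    return nodes

# Hypothetical relation of 100 rows spread over a 4-node cluster
parts = partition([{"id": i} for i in range(100)], "id", 4)
```

Each node then scans only its fragment, which is what makes the regular query plan with dynamic segmentation of intermediate relations necessary when fragments must be re-joined.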
Lobanov A.A., Filgus D.I. - A method for finding the shortest Hamiltonian path in an arbitrary graph based on the rank approach, providing high efficiency and small error in organizing the management of multiple transactions and queries implemented in network databases pp. 63-75


Abstract: The object of research is the workload-management subsystem of a network database. The subject of research is the management of scheduling the execution of subscriber queries and transactions in a network database. In many cases, existing solutions do not provide the required access time and accuracy of the found solution, hence the need for a method of scheduling the execution of user queries and transactions. Particular attention is paid to algorithms for sampling queries in network databases, as well as to a conceptual model of the process of managing transactions and queries. Methods of graph theory are used. The effectiveness of the solution was evaluated using a systems approach, systems analysis, and the theory of operations research; the experimental data obtained during the work were processed in accordance with the provisions of mathematical statistics. A method has been developed for finding the shortest Hamiltonian path in an arbitrary graph based on a rank approach, providing high efficiency and small error in organizing the management of multiple transactions and queries executed in network databases. The developed method minimizes the idle time of computing devices, reduces the volume and time of data transfer between devices, increases overall scalability, and minimizes data access time. An important advantage of the proposed method is the reduced number of elementary operations and of processed vectors in the queue of query operations, which significantly reduces the time needed to form the queue of operations in the queries.
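For context, a common low-cost heuristic for the shortest Hamiltonian path problem is the greedy nearest-neighbor rule sketched below. This is a generic baseline for the problem the abstract addresses, not the authors' rank approach, whose details are not given here.

```python
def greedy_hamiltonian_path(dist, start=0):
    """Nearest-neighbor heuristic: from each vertex, move to the
    closest unvisited vertex until all vertices are in the path.
    `dist` is a complete distance matrix."""
    n = len(dist)
    path, visited, total = [start], {start}, 0
    while len(path) < n:
        cur = path[-1]
        nxt = min((v for v in range(n) if v not in visited),
                  key=lambda v: dist[cur][v])
        total += dist[cur][nxt]
        path.append(nxt)
        visited.add(nxt)
    return path, total

# Hypothetical 3-device transfer-cost matrix
dist = [[0, 1, 4],
        [1, 0, 2],
        [4, 2, 0]]
path, total = greedy_hamiltonian_path(dist)
```

Such a heuristic runs in O(n^2) but can give paths far from optimal, which motivates more accurate approaches like the rank method proposed in the article.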
Data encryption and data protection
Baltaev R.K., Lunegov I.V. - Steganographic method of embedding information using a noise-like sequence and preserving the statistical model of images pp. 76-83


Abstract: The subject of research is a steganographic method of embedding information in digital images. Steganography can hide not only the content of information but also the very fact of its existence. The paper addresses one of the most important problems in the development of steganographic methods: the secrecy of transferring protected information. Secrecy means not only the visual or auditory indistinguishability of a digital media resource from one with embedded information, but also statistical indistinguishability. Special attention is paid to preserving the spatial statistical dependence between image pixels. The methodological basis of the research comprises methods of mathematical statistics and image processing theory, as well as image distortion metrics. The novelty of the research lies in the development of a new method of embedding information in static images. The authors consider in detail the problem of applying an autoregressive moving-average process to represent the statistical dependence of image pixels. It is shown that the proposed method allows information to be embedded into digital images without significant distortion.
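The role of a noise-like sequence in embedding can be illustrated with a minimal sketch: a keyed pseudo-random sequence selects which pixel LSBs carry the message, so the same key recovers it. This is a generic illustration only; the authors' method additionally preserves the image's statistical model, which this sketch does not attempt.

```python
import random

def embed(pixels, bits, seed=42):
    """Embed message bits into the LSBs of pseudo-randomly chosen
    pixels; the keyed PRNG stands in for the noise-like sequence."""
    rng = random.Random(seed)
    positions = rng.sample(range(len(pixels)), len(bits))
    out = list(pixels)
    for pos, bit in zip(positions, bits):
        out[pos] = (out[pos] & ~1) | bit   # change only the LSB
    return out

def extract(pixels, n_bits, seed=42):
    """Regenerate the same keyed sequence to recover the bits."""
    rng = random.Random(seed)
    positions = rng.sample(range(len(pixels)), n_bits)
    return [pixels[pos] & 1 for pos in positions]

# Hypothetical 64-pixel grayscale row and an 8-bit message
pixels = list(range(50, 114))
bits = [1, 0, 1, 1, 0, 0, 1, 0]
stego = embed(pixels, bits)
```

Each embedded bit changes a pixel by at most 1, which keeps the visual distortion small; statistical indistinguishability, the harder goal the abstract focuses on, requires modeling the inter-pixel dependence as well.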