Journal "Cybernetics and Programming"
Rubric: Software for Innovative Information Technologies
Borodin A.V., Azarova A.N.  Methods of classification and dimensionality reduction for the visualization of performance metrics


pp. 1-35

DOI: 10.7256/2306-4196.2015.4.15271
Abstract: The paper presents a methodology for assessing the technical efficiency of network infrastructure. Particular attention is given to methods of visualizing performance metrics by comparing the evaluated sample with a set of alternative solutions under stochastically behaving external conditions. The proposed method for visualizing the time-response characteristics of access to Internet resources was developed specifically to demonstrate the advantages offered by the concept of the "cognitive Internet". Unlike numerical efficiency characteristics, the proposed visualization allows one to take in "at a glance" the state of all access channels relative to the optimal channel over a given integration interval. At the same time, the method does not preclude the joint use of consistent numerical efficiency characteristics, and its scope is not restricted to the applications mentioned. Methods of multivariate statistical analysis (discriminant function analysis and principal component analysis) form the basis of the algorithm for visualizing the time-response characteristics of Internet access. The main result of the research is the algorithm and software for visualizing performance metrics of infrastructure solutions for providing access to Internet resources. The novelty of the research lies not only in the novelty of the subject area (cognitive Internet technology) but also in the form of presentation of the results: a projection of the hodograph of time-response characteristics onto the most informative plane.
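The projection onto "the most informative plane" described in the abstract can be sketched with principal component analysis. In the sketch below, the data is synthetic and the shapes are hypothetical: rows stand in for access channels, columns for access-time measurements over an integration interval.

```python
import numpy as np

# Synthetic stand-in for the paper's data: rows are access channels,
# columns are access-time measurements over an integration interval.
rng = np.random.default_rng(1)
X = rng.normal(loc=[100.0, 120.0, 90.0, 110.0], scale=5.0, size=(6, 4))

def project_pca(X, n_components=2):
    """Project samples onto the most informative plane spanned by the
    first principal components of the centered data."""
    Xc = X - X.mean(axis=0)                # center each metric
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T        # coordinates in the PC plane

coords = project_pca(X)  # one 2-D point per access channel
```

Plotting `coords` places every channel on a single plane, which is what lets a reader compare all channels with the optimal one in a single glance.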
Malashkevich V.B., Malashkevich I.A.  Elements of the algebra of triplexes in idempotent bases


pp. 12-28

DOI: 10.7256/2306-4196.2016.1.17583
Abstract: The subject of study is the algebra of ternary (three-dimensional) hypercomplex numbers (triplexes). Since the time of Hamilton (1843), algebras of hypercomplex numbers have attracted the attention of researchers. The largest number of papers in this area is devoted to quaternion algebra and bicomplex numbers and their applications to various problems of science and technology. The algebra of ternary hypercomplex numbers is less studied, yet it is undoubtedly promising for problems involving the processing of point objects and fields in three-dimensional Euclidean space. The main objective of the article is to construct an idempotent basis for the algebra of ternary hypercomplex numbers. Idempotent bases are typical of commutative multiplicative algebras without division. Such bases provide a simple definition and a convenient way of studying hypercomplex constructions, as well as a significant increase in computational efficiency. The paper lists all possible unit vectors of potential idempotent bases of triplex numbers and highlights two idempotent bases that provide a non-redundant representation of triplexes. The main attention is given to one of these bases, whose unit vectors are complex. The paper shows that an idempotent triplex basis allows the arithmetic operations and functions of a triplex argument to be defined in terms of the well-studied algebras of real and complex numbers, while providing high computational efficiency for evaluating these operations and functions.
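A minimal sketch of the idempotent-basis idea, under the assumption that triplexes are modeled as R[x]/(x^3 - 1), i.e. vectors in R^3 multiplied by cyclic convolution (a common construction; the paper's exact definition may differ). A discrete-Fourier-style change of basis yields one real and one complex component, in which multiplication becomes componentwise:

```python
import cmath

# Assumption: triplexes are modeled as R[x]/(x^3 - 1); the paper's
# exact construction may differ.
w = cmath.exp(2j * cmath.pi / 3)  # primitive cube root of unity

def to_idem(t):
    """Coordinates in the idempotent basis: one real and one complex
    component (the third is the conjugate of the second and is dropped)."""
    return tuple(sum(t[k] * w**(j * k) for k in range(3)) for j in (0, 1))

def from_idem(p):
    """Inverse change of basis back to standard triplex coordinates."""
    T = [p[0], p[1], p[1].conjugate()]
    return tuple((sum(T[j] * w**(-j * k) for j in range(3)) / 3).real
                 for k in range(3))

def mul_conv(u, v):
    """Reference multiplication: cyclic convolution in R^3."""
    return tuple(sum(u[i] * v[(k - i) % 3] for i in range(3))
                 for k in range(3))

def mul_idem(u, v):
    """The same product, computed componentwise in the idempotent basis."""
    a, b = to_idem(u)
    c, d = to_idem(v)
    return from_idem((a * c, b * d))
```

The componentwise form is what delivers the computational advantage: one real and one complex multiplication instead of nine real multiplications per product.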
Milovanov M.M.  Using Windows PowerShell scripts to manage Microsoft SQL Server backups in an application for a Department of Social Security


pp. 7-10

DOI: 10.7256/2306-4196.2015.3.15410
Abstract: Preserving accumulated data is critically important today. The development of modern information technologies and database-backed applications raises questions of storing and backing up large volumes of data, and strict requirements on recovery speed demand a correct organization of data storage and backups. With this in mind, the author shares his experience of automating backup creation from the command line using Windows PowerShell scripts. The article describes the mechanisms and results of the applied technique; the research focuses on the IT processes involved. The article reviews the use of the modern Windows PowerShell command line in place of the legacy command interpreter, presents a short overview of the main commands used in writing a PowerShell script, and gives examples of using the developed script for backing up and storing databases. The technique has proved reliable over long-term testing on different platforms; it increases the efficiency of IT operations, the reliability of information processes, and the productive use of employee time.
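The article's scripts are written in PowerShell; as an illustration of the same idea, here is a minimal Python sketch that builds a `sqlcmd` invocation running the T-SQL `BACKUP DATABASE` statement. The server name, paths, and function names are hypothetical.

```python
import datetime
import subprocess

def build_backup_command(database, backup_dir):
    """Build a sqlcmd invocation that runs the T-SQL BACKUP DATABASE
    statement; the backup file name is timestamped so successive runs
    do not overwrite each other."""
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_file = f"{backup_dir}\\{database}_{stamp}.bak"
    tsql = (f"BACKUP DATABASE [{database}] "
            f"TO DISK = N'{backup_file}' WITH INIT")
    # -S server, -E trusted (Windows) authentication, -Q run query and exit
    return ["sqlcmd", "-S", "localhost", "-E", "-Q", tsql]

def run_backup(database, backup_dir):
    """Execute the backup; requires SQL Server and sqlcmd on this host."""
    subprocess.run(build_backup_command(database, backup_dir), check=True)
```

Scheduling such a call (via Task Scheduler or a cron-like service) is what turns a one-off backup command into the automated regime the abstract describes.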
Poryadin A.E., Oparin K.S.  Nonparametric learning model for a system of diagnostics of psychophysiological qualities


pp. 13-19

DOI: 10.7256/2306-4196.2016.2.18155
Abstract: The article studies decision-support systems in the field of diagnostics of psychophysiological qualities of a person. The subject of the research is the use of neural networks in the development of tests for evaluating a person's psychophysiological state. The authors examine this possibility building on results obtained by other researchers who applied neural networks to medical diagnostic problems, such as the diagnosis of myocardial infarction or the recognition of emotions from psychophysiological parameters. The authors used methods of mathematical modeling: probability theory, mathematical statistics, artificial intelligence, forecasting, and decision-making. The study shows that a neural network is an effective tool for studying such a stochastic system as a human being. Using neural networks in systems of psychophysiological diagnosis improves the accuracy of diagnosis by uncovering hidden relationships between different human subsystems. The suitability of neural networks for processing psychophysiological test results is confirmed with a generalized description of a neural network and examples of input and output vectors for processing the results of a reaction-to-a-moving-object test.
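As a minimal sketch of the kind of network the abstract describes, a one-hidden-layer perceptron mapping test measurements to a diagnostic score could look like the following. The architecture, input dimensions, and weights are hypothetical placeholders, not the authors' trained model.

```python
import numpy as np

# Hypothetical configuration: 4 measured parameters (e.g. statistics of
# a reaction-to-a-moving-object test) map to one diagnostic score.
# The weights below are random placeholders; in practice they would be
# learned from test data.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)

def forward(x):
    """One-hidden-layer perceptron: tanh hidden layer, sigmoid output
    squashing the diagnostic score into (0, 1)."""
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

score = forward(np.array([0.21, 0.05, 0.80, 0.30]))
```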
Perminova M.Yu.  Analysis of a partition-based algorithm for polynomial decomposition


pp. 21-34

DOI: 10.7256/2306-4196.2015.6.17169
Abstract: The research focuses on generating functions, an effective tool for solving various mathematical problems in combinatorics, probability theory, mathematical physics, the analysis of algorithms, etc. The subject of the research is one class of generating functions: polynomials. Special attention is paid to the problem of polynomial decomposition, which admits a number of solutions. The author proposes a new polynomial decomposition algorithm based on partitions, gives a brief description of it, and demonstrates it on an example. The study determines the computational complexity of the algorithm, which comprises the cost of generating partitions, producing a monomial, and solving an equation. The time complexity of the partition-based polynomial decomposition algorithm is calculated using results obtained by D. Knuth and data from the On-Line Encyclopedia of Integer Sequences. The original polynomial decomposition algorithm is also given, and its time complexity is shown to be O(n^2). The author compares the described algorithm with its analogues; the analysis shows that most decomposition algorithms have polynomial computational complexity of O(n^2). Experimental curves of the computational complexity of the partition-based algorithm and of the known algorithms are presented.
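The combinatorial primitive of such an algorithm is the generation of integer partitions. A minimal sketch (not the author's implementation) that yields the partitions of n as non-increasing tuples:

```python
def partitions(n, max_part=None):
    """Yield all integer partitions of n as non-increasing tuples,
    largest first (e.g. 5 -> (5,), (4, 1), (3, 2), ...)."""
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        # Keep subsequent parts no larger than the current one,
        # so each partition is emitted exactly once.
        for rest in partitions(n - first, first):
            yield (first,) + rest
```

The number of partitions p(n) grows sub-exponentially, which is why the cost of the partition-generation stage must be accounted for separately from the O(n^2) arithmetic of the decomposition itself.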
Fukin I.A.  Cloud management system for communication between educational institutions and employers


pp. 25-37

Abstract: Creating conditions for effective interaction between the subjects of an educational cluster in the current institutional environment requires forming a common information space for the participants. Complex algorithms and information support for this interaction are needed for two reasons: managing the learning process is difficult, since the quality of management and the adjustment of curricula, load distribution, and class schedules can be assessed only after a certain learning cycle is completed, and a single information environment covering this segment of the labor market is lacking. The article reviews the problem of interaction between enterprises and educational institutions in the training of specialists. As a tool for addressing it, the author suggests a subsystem for managing the interaction of stakeholders in the education and labor markets within a cloud-based educational process management system. The proposed solution is presented as a software module.
Borodin A.V., Biryukov E.S.  The practical implementation of some algorithms related to the compositions of numbers


pp. 27-45

DOI: 10.7256/2306-4196.2015.1.13734
Abstract: Among the combinatorial algorithms of additive number theory, algorithms for listing compositions of natural numbers occupy a special place. On the one hand, they are ideologically among the simplest algorithms of this theory; on the other hand, they play a huge role in all applications connected with the multinomial theorem. In recent years, driven by the rapid development of the general theory of risk, the ideas underlying the multinomial theorem have been applied to problems of risk measurement in homogeneous systems of high dimensionality. Solving these problems requires the mass listing of compositions of fixed length and the calculation of the number of such compositions for sufficiently large values of both the number and the composition length. Under these conditions, the efficient implementation of these algorithms becomes the most urgent task. The article is devoted to questions of synthesizing efficient algorithms for listing compositions of fixed length and calculating the number of such compositions. As a methodological base, the authors use facts of set theory, approaches of the theory of algorithm complexity, and some basic results of number theory. The authors propose new efficient implementations of two algorithms: an algorithm for listing all compositions of fixed length, based on the multiset representation of number partitions, and an algorithm for calculating the number of compositions of a given kind, implemented without high-precision machine arithmetic. The article gives complexity estimates for the proposed algorithms and presents the results of numerical experiments demonstrating the effectiveness of their implementation in the VBA programming language.
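The paper's implementations are in VBA; a Python sketch of the same combinatorics (not the authors' algorithms), listing the compositions of n into k positive parts and counting them via the stars-and-bars formula C(n-1, k-1), might look like:

```python
from math import comb

def compositions(n, k):
    """Yield all compositions of n into k positive parts,
    in lexicographic order."""
    if k == 1:
        if n >= 1:
            yield (n,)
        return
    for first in range(1, n - k + 2):
        # The remaining k-1 parts must sum to n - first.
        for rest in compositions(n - first, k - 1):
            yield (first,) + rest

def count_compositions(n, k):
    """Stars and bars: there are C(n-1, k-1) compositions of n
    into k positive parts."""
    return comb(n - 1, k - 1)
```

Having a closed-form count lets an implementation verify an enumeration, or skip it entirely when only the number of compositions is needed.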
Ponomarev D.Yu.  Software system for load distribution in information systems


pp. 29-36

DOI: 10.7256/2306-4196.2013.5.9762
Abstract: The article presents the results of developing software for calculating load distribution in information systems using the tensor methodology. Applying tensor models allows the task to be solved for a wide range of information networks. To apply tensor analysis to the problem of analyzing traffic distribution in an information network, a software system implementing the stages of network analysis was developed. The author notes that the mathematical apparatus used is well formalized, so the tensor methodology can be implemented in a software system with available software tools. As an example, the author presents a study of traffic distribution on a network performed with the developed software. In conclusion, it is stated that further analysis of the load intensity distribution requires defining the type of the information distribution systems (loss or waiting systems) and determining the required number of lines for a given level of losses.
Urazaeva T.A.  Application package “MultiMIR”: architecture and use


pp. 34-61

DOI: 10.7256/2306-4196.2014.5.12962
Abstract: Evaluating the risks of system development is an urgent task for a variety of disciplines, such as economics and sociology, technology and ecology, and for systems studied at the intersection of different disciplines. Often the parameters of such systems are discrete and the set of possible states is bounded. The application package “MultiMIR” was designed to evaluate development risks in such systems. An important difference of “MultiMIR” from other applications is that it achieves polynomial computational complexity for some classes of systems, while most analogues offer only exponential complexity. The article describes the purpose of the package, the main ideas underlying its algorithms, and its architecture, and gives an overview of ways of using it. The conceptual basis of the theory behind the algorithms implemented in “MultiMIR” is the probability-theoretic approach. As the specific mathematical apparatus, the author has chosen the formalism of multiset theory, which, in the author's opinion, has the richest expressive possibilities for studying the described subject area. The first version of the package was developed in the VBA subsystem shipped with Microsoft Office; this choice was dictated by the features and preferences of the package's primary audience: banking and financial analysts. Using “MultiMIR” made it possible for the first time to compute exactly such nonlinear risk measures as expected utility, distorted probability measures, Value at Risk, and so on, for medium and large homogeneous portfolios of term financial instruments without resorting to time-consuming analytical methods. Unlike the Monte Carlo method traditionally used for this purpose, the approach based on the described package yields an exact solution using a comparable amount of CPU resources. The “MultiMIR” package can also be used to verify the reliability of results obtained with Monte Carlo methods considered classical in financial risk management.
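The exact-distribution idea behind such a package can be illustrated with a generic convolution sketch (this is not MultiMIR's multiset-based algorithm): compute the exact loss distribution of a homogeneous portfolio and read off Value at Risk from it, with no sampling error.

```python
from collections import defaultdict

def convolve(d1, d2):
    """Exact distribution of the sum of two independent discrete
    losses, each given as a {loss: probability} mapping."""
    out = defaultdict(float)
    for x, p in d1.items():
        for y, q in d2.items():
            out[x + y] += p * q
    return dict(out)

def portfolio_distribution(single, n):
    """Exact total-loss distribution for n i.i.d. instruments."""
    total = {0: 1.0}
    for _ in range(n):
        total = convolve(total, single)
    return total

def value_at_risk(dist, alpha):
    """Smallest loss level at which the CDF reaches alpha."""
    acc = 0.0
    for loss in sorted(dist):
        acc += dist[loss]
        if acc >= alpha:
            return loss
```

Because each instrument has a bounded discrete support, the support of the total grows only linearly with n, which is what makes polynomial-time exact computation possible where naive state enumeration would be exponential.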
Milovanov M.M.  Software implementation of a workbench for testing trading algorithms based on the project approach


pp. 229-235

DOI: 10.7256/2306-4196.2016.1.17855
Abstract: The article describes an approach to the development and implementation of a software system for designing, testing, and deploying trading algorithms. The author shows methods of passing data from the environment to the trading terminal and back, presents a review of similar software, and highlights the main advantages of the proposed approach to design. A prototype-based programming approach is used for the software implementation, and observation serves as the research method. The object of the research is the trading algorithm; the subject of the study is the data set used to analyze it. The main novelty of the proposed approach is the use of prototype-based programming, in place of an object-oriented model, for algorithm design and implementation. The article suggests a scheme of data exchange with third-party applications via the native dynamic libraries of the terminal and gives an algorithm for the application's operation.
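As an illustration of what such a workbench tests (the strategy and the function names are hypothetical, not the article's system), here is a minimal backtest of a moving-average crossover strategy over a fixed price series:

```python
def sma(values, window):
    """Simple moving average; None until the window fills."""
    out = []
    for i in range(len(values)):
        if i + 1 < window:
            out.append(None)
        else:
            out.append(sum(values[i + 1 - window:i + 1]) / window)
    return out

def backtest_crossover(prices, fast=2, slow=3):
    """Go long when the fast SMA is above the slow SMA, exit when it
    falls below; return the final profit/loss for one unit."""
    f, s = sma(prices, fast), sma(prices, slow)
    position, entry, pnl = 0, 0.0, 0.0   # 0 = flat, 1 = long
    for i in range(len(prices)):
        if f[i] is None or s[i] is None:
            continue
        if position == 0 and f[i] > s[i]:
            position, entry = 1, prices[i]
        elif position == 1 and f[i] < s[i]:
            pnl += prices[i] - entry
            position = 0
    if position == 1:                    # close any open position
        pnl += prices[-1] - entry
    return pnl
```

Running many algorithms against the same replayed data set, and feeding live terminal data through the same interface, is the essence of the design-test-deploy cycle the article describes.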
