
Informatics (190.00 RUB)

First author: Lebedev Viktor Ivanovich
Publisher: NCFU Publishing House
Pages: 102
ID: 314115
Annotation: The course of lectures is prepared in accordance with the requirements of the Federal State Educational Standard of Higher Professional Education. It covers the foundations of informatics and information technologies and the structure and functioning of information networks. Particular attention is given to Internet technologies and to trends in the development of information technologies. The course aims to build the set of general cultural and professional competencies of the future bachelor in both engineering and humanities degree programmes. Recommended for students of humanities and engineering specialties, as well as for students, postgraduates, and teachers studying the foundations of informatics and information technologies.
Recommended for: Students of humanities and engineering specialties, as well as students, postgraduates, and teachers studying the foundations of informatics and information technologies.
UDC: 004.94
BBK: 32.81
Lebedev, V. I. Informatics: a course of lectures in English: degree programme 08.03.01.62 – Construction / V. I. Lebedev. – Stavropol: NCFU Publishing House, 2015. – 102 p. – Bibliography: pp. 97–98. – URL: https://rucont.ru/efd/314115 (accessed 25.04.2024)

Preview (excerpts from the work)

This manual covers the foundations of computer science and information technologies, the structure and functioning of information networks, and computer security. <...> A signal is any process that carries information. <...> A message is information presented in a certain form and intended for transmission. <...> The crux of this information theory, originally developed to deal with the efficiency of information transmission in electronic channels, is the definition of an information quantity that can be measured. <...> The price to pay for the ability to objectively measure such a quantity is that it does not deal at all with the subjective aspects of information, namely semantics and pragmatics. <...> Indeed, information is defined as a quantity that depends on symbol manipulation alone. <...> Information Measurement. A bit is the information quantity necessary for the unequivocal definition of one of two equiprobable events. If the bit is the minimum unit of information, the byte is its basic unit. <...> SUMMARY: The processes that carry out the gathering, transfer, processing, and accumulation of information are called information processes. <...> Information gathering is the process of receiving external information and reducing it to a form standard for the given information system. <...> Information transfer is a process in which the source transmits the information and the addressee receives it. <...> Some means of gathering and registration collect information automatically and transfer it to the computer. <...> Information processing is the ordered process of transforming information according to the algorithm for solving a problem. <...> Thus the coding system uniquely defines about 16.7 million different colors, which is actually close to the sensitivity of the human eye. <...> Paying tribute to the phenomenon often called the «computer revolution», and assessing the prospects of computer development, in everyday life we concentrate on "personal computing" and "business prose" processing. <...> These are closely connected with the development of personal computers, automated workplaces, software specifics, and text processing, electronic editing in particular. <...> But computer structures are based on universal logical principles that allow distinguishing the following main devices found in any computer: memory (storage), consisting of enumerated cells; a processor, comprising a control unit (CU) and an arithmetic logic unit (ALU); input equipment; output equipment. <...> Processor functions: data <...>
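Where the excerpt mentions 24-bit color coding, a quick sketch in Python (added here for illustration; not part of the manual) confirms the number of representable colors:

    # Each of the three RGB channels (red, green, blue) occupies one byte
    # (8 bits), so a 24-bit code distinguishes 2^24 distinct colors.
    bits_per_channel = 8
    channels = 3
    colors = 2 ** (bits_per_channel * channels)
    print(colors)  # 16777216, i.e. about 16.7 million colors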
MINISTRY OF EDUCATION AND SCIENCE OF THE RUSSIAN FEDERATION
FEDERAL STATE AUTONOMOUS EDUCATIONAL INSTITUTION OF HIGHER PROFESSIONAL EDUCATION «NORTH-CAUCASUS FEDERAL UNIVERSITY»

V. I. Lebedev

INFORMATICS

COURSE OF LECTURES IN ENGLISH

Degree programme 08.03.01.62 – Construction

Stavropol 2015
UDC 004.94(075.8)
BBK 32.81 я 73
И 74

Printed by decision of the methodical council of the North-Caucasus Federal University.

Reviewers: Doctor of Economic Sciences, Professor A. V. Shuvaev; Candidate of Pedagogical Sciences, Senior Lecturer of the Computer Science Chair I. P. Khvostova.

Lebedev V. I. Informatics: course of lectures in English. – Stavropol: NCFU Publishing House, 2015. – 102 p.

The manual is prepared in accordance with the requirements of the Federal State Educational Standard of Higher Professional Education of the Russian Federation. It covers the foundations of computer science and information technologies, the structure and functioning of information networks, and computer security. Particular attention is given to Internet technologies; basic data on trends in the development of computer technology and information technologies are also included. The manual aims to build the set of general cultural and professional knowledge of the future bachelor in both technical and humanities degree programmes. It is recommended for students of humanities and engineering specialties of NCFU, as well as for students, postgraduates, and teachers studying informatics and information technologies.

© North Caucasian Federal University, 2015
1. COMPUTER SCIENCE AND INFORMATION PROPERTIES

INTRODUCTION

Lecture Outline
1.1. Computer science and main definitions.
1.2. Information measurement.
1.3. Analog information versus digital information.
1.4. Information processes in nature: codes.
1.5. Information properties.

The word information has been used to signify knowledge and aspects of cognition such as meaning, instruction, communication, representation, signs, symbols, etc. The Oxford English Dictionary defines information as “the action of informing; formation or molding of the mind or character, training, instruction, teaching; communication of instructive knowledge”. Among the most outstanding achievements of the twentieth century were the invention of computers and a new understanding of the concept of information itself. Furthermore, modern science is unraveling the nature of information in numerous areas such as communication theory, biology, neuroscience, cognitive science, education, and others.

1.1. Computer Science and Main Definitions

Computer science is the area of human activity concerned with the processes of information transformation by means of computers and with their interaction with the field of application.

Information is data about objects and phenomena, their parameters, properties, and states, which reduces the degree of uncertainty and incompleteness of our knowledge about them.

Concepts related to information are the signal, the message, and the data. A signal is any process that carries information. A message is information presented in a certain form and intended for transmission. Data is information presented in a formalized form and intended for processing, for example by computation.
Information became a prominent word and notion in the article published in 1948 by Claude Shannon. However, the word information did not figure in the title, which was “The mathematical theory of communication”, even though the work became known as the Shannon Information Theory. The crux of this information theory, originally developed to deal with the efficiency of information transmission in electronic channels, is the definition of an information quantity that can be measured. Such analysis of information is concerned with the discovery of the elementary particles or units of information. The price to pay for the ability to objectively measure such a quantity is that it does not deal at all with the subjective aspects of information, namely semantics and pragmatics. Indeed, information is defined as a quantity that depends on symbol manipulation alone.

Since information content depends on the language used, Shannon needed to compute information content in the most economical symbol system available, which he proved to be the binary system. Since the binary system, which encodes messages using only two symbols, typically “0” and “1”, is the most economical, measuring information content in Shannon's theory demands encoding every message in the binary system and then counting the alternative choices in this system. The most elementary choice one can make is between two items: “0” or “1”, “heads” or “tails”, “true” or “false”, etc. Shannon defined the bit as such an elementary choice, or unit of information content, on which all selection operations are built. Bit is short for binary digit and is equivalent to the choice between two equally likely alternatives.
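To make the binary-encoding point concrete, here is a minimal sketch in Python (an assumed illustration, not from the manual) that encodes a message as binary digits and counts the elementary 0/1 choices it occupies:

    # Encode a text message as a string of binary digits ("0"/"1"),
    # then count the elementary two-way choices (bits) it takes.
    message = "HEADS"
    bits = "".join(f"{byte:08b}" for byte in message.encode("ascii"))
    print(bits)       # 0100100001000101... (8 binary digits per character)
    print(len(bits))  # 40 elementary 0/1 choices for a 5-character message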
1.2. Information Measurement

A bit is the information quantity necessary for the unequivocal definition of one of two equiprobable events.

If the bit is the minimum unit of information, the byte is its basic unit. A group of 8 bits of information is called a byte.

1 KB = 1024 bytes = 2^10 bytes.
1 MB = 1024 KB = 2^20 bytes.
1 GB = 1024 MB = 2^30 bytes.
1 TB = 1024 GB = 2^40 bytes.
And so on.

The information quantity is the numerical characteristic of a signal reflecting the degree of uncertainty (incompleteness of knowledge) that disappears after the message is received in the form of the given signal. This measure of information uncertainty is called entropy, and this method of measuring information quantity is called statistical.

If, as a result of receiving a message, full clarity on some question is reached, one can say that full or exhaustive information has been received, and there is no need to receive additional information. Conversely, if after receiving the message the uncertainty remains unchanged, no information has been received (zero information).

The discussion shows that the concepts of information, uncertainty, and possibility of choice are closely connected. Any uncertainty assumes a possibility of choice, and any information, by reducing uncertainty, also reduces the possibility of choice. In the case of full information, there is no choice. Partial information reduces the number of choice variants, thereby reducing uncertainty.

Equiprobable events. Any system is characterized by its states, which appear as a result of certain events. We will consider equiprobable events or states, for example getting heads or tails when flipping a coin.

Hartley's formula defines the information quantity I (in bits) for a number N of possible equiprobable events:

I = log2 N.  (1)

To determine the number of possible events when the information quantity is known, apply the inverse formula:

N = 2^I.  (2)

Events that are not equiprobable. If the outcomes of an event have different probabilities, the events are not equiprobable. For example, if one side of a coin is heavier, the coin will land with that side down more often.
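A minimal sketch of Hartley's formula and its inverse, written here in Python for illustration (not part of the manual):

    import math

    def hartley_bits(n_events: int) -> float:
        """Information quantity I (in bits) for N equiprobable events: I = log2 N."""
        return math.log2(n_events)

    def events_from_bits(i_bits: float) -> float:
        """Inverse formula: number of equiprobable events N = 2^I."""
        return 2 ** i_bits

    # A fair coin flip (2 equiprobable outcomes) carries exactly 1 bit:
    print(hartley_bits(2))      # 1.0
    # Choosing one of 256 equiprobable byte values takes 8 bits:
    print(hartley_bits(256))    # 8.0
    # Conversely, 8 bits of information distinguish 2^8 = 256 events:
    print(events_from_bits(8))  # 256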