Where I Think Technology Is Heading


Let's start with definitions of a few technologies. Artificial intelligence is intelligence demonstrated by machines. AI-based computer software uses a range of computational techniques to approximate human thinking and decision-making processes. Although artificial intelligence may seem to match the natural intelligence of living beings, the two are different from each other.

The growing need for computing, driven by the proliferation of data sources and of the data produced worldwide, forces new developments in the field of computers. Faster and more capable IT tools are being developed constantly, while the search for new ways to meet information needs continues. Advances in electronics are making computer chips and boards ever smaller.

This new situation makes it possible for almost any physical object to act like a computer. By embedding an ever-smaller computer board into an object, we can transform it into a smart, connected object that can process data and reach the Internet via networks. The result is a new ecosystem in which almost every object is a computer. Moreover, objects on a network or connected to the Internet can perform computational tasks by sharing work among themselves, and some of these tasks are already aimed at performing artificial intelligence functions.

As information needs grow, obtaining computation results faster becomes a necessity. If the slowness of networks or the Internet is delaying results, a solution must be found. Speed is indispensable, especially for real-time systems. Besides, it may not be necessary to reach distant points of the Internet for every transaction. This brings to the fore an approach technically called "edge computing", or in everyday language "nearby computing".

Edge computing is a distributed computing approach that brings computation and data storage closer to where they are needed, in order to improve response times and save Internet bandwidth. In practice, edge computing means any computer program that runs closer to the source of requests and therefore responds with less latency. When such programs implement artificial intelligence applications, this type of machine intelligence is called "edge artificial intelligence" (edge AI). In short, artificial intelligence is getting closer to people and everyday life with new information devices and software developments.
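To make the latency argument concrete, here is a minimal sketch of my own (not from any specific product; all names and latency figures are hypothetical) of an edge node that answers requests locally when it can and pays the long round trip to a distant cloud only when it must:

```python
import time

# Hypothetical round-trip latencies: a nearby edge node vs. a distant cloud region.
EDGE_LATENCY_S = 0.005   # ~5 ms to a local gateway
CLOUD_LATENCY_S = 0.120  # ~120 ms to a remote data center

def answer_locally(request: str):
    """Try to serve the request from the edge node's own cache or model."""
    local_state = {"temperature": "21.5 C"}  # toy local data
    return local_state.get(request)

def handle(request: str):
    """Serve at the edge when possible; otherwise fall back to the cloud."""
    start = time.monotonic()
    answer = answer_locally(request)
    if answer is None:
        time.sleep(CLOUD_LATENCY_S)  # simulate the long hop to the cloud
        answer = f"cloud answer for {request!r}"
    else:
        time.sleep(EDGE_LATENCY_S)   # simulate the short local hop
    return answer, round(time.monotonic() - start, 3)

print(handle("temperature"))  # fast: answered at the edge
print(handle("forecast"))     # slow: required the cloud round trip
```

The point is only the shape of the decision: the closer the answer lives to the request, the less time the network itself can cost you.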

A sensor is a device, module, machine, or subsystem whose purpose is to detect events or changes in its environment and send that information to other electronic devices (e.g., a computer processor). Sensors are always used together with other electronic devices, and they collect quantitative data from the environment, such as physical and chemical measurements. Sensors play an important role in the growth in the variety and amount of data we see today, and the technologies mentioned above are used to process this data. Manufacturing and service businesses and factories are among the leading areas where sensors are becoming ever more common, and this demands the change and transformation of information technologies in production spaces.
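As a toy illustration of that event-detection role (my own sketch; `read_temperature` merely stands in for a real sensor driver), a sensor loop typically polls a value, keeps a record, and notifies another device when a threshold is crossed:

```python
import random
import statistics

def read_temperature() -> float:
    """Stand-in for a real sensor driver; returns degrees Celsius."""
    return random.gauss(21.0, 1.0)

def watch(threshold: float = 23.0, samples: int = 100) -> None:
    """Poll the sensor, keep readings, and report threshold crossings."""
    readings = []
    for _ in range(samples):
        value = read_temperature()
        readings.append(value)
        if value > threshold:
            # In a real system this would notify another device or service.
            print(f"event: {value:.1f} C exceeded the {threshold} C threshold")
    print(f"mean over {samples} samples: {statistics.mean(readings):.2f} C")

watch()
```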

In the end, the picture looks like this: edge computing and edge AI applications will become increasingly widespread, especially in production environments where the data to be processed is diversifying and multiplying.

One of the benefits of edge computing is that, instead of sending data to remote locations in the cloud (i.e., Internet servers) for processing, it conserves bandwidth and improves performance by processing the data closer to the users and devices that need it. Manufacturers can reduce latency by deploying AI locally, which can cut the use and cost of cloud services while accelerating situational insight and decision-making. Processing some of the data locally reduces bandwidth and cellular data usage, so connectivity costs fall. And because the computing runs locally, facilities in remote areas with poor communication infrastructure are less affected by connectivity losses that could interfere with mission-critical, time-sensitive decisions.
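A minimal sketch of that bandwidth saving (again my own illustration; the sensor function and field names are hypothetical): instead of uploading every raw reading, the edge device summarizes them and sends only a compact result over the network:

```python
import json
import random

def read_vibration() -> float:
    """Stand-in for a machine vibration sensor (arbitrary units)."""
    return random.uniform(0.0, 1.0)

def collect_and_summarize(samples: int = 1000) -> str:
    """Aggregate raw readings locally; only a small summary leaves the device."""
    readings = [read_vibration() for _ in range(samples)]
    summary = {
        "count": samples,
        "min": round(min(readings), 3),
        "max": round(max(readings), 3),
        "mean": round(sum(readings) / samples, 3),
    }
    return json.dumps(summary)

payload = collect_and_summarize()
print(payload)  # a few dozen bytes cross the network instead of 1000 raw values
```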

With the development of small computer chips and boards, it has become possible to make any object smart and connected. Thanks to hardware and software embedded in the object, smart, connected objects can process data while connecting to networks and the Internet. Sensors attached to these objects let them gather information about themselves and the outside world. In the short term, the number of objects that can be connected to the Internet is expected to reach as many as 50 billion, meaning connected objects will use the Internet far more than people do. The system formed by these connected objects is called the Internet of Things.

The growing number of people and objects using the Internet, and the proliferation of software applications running in this environment, mean new kinds of virtual traffic problems. New technologies must be developed to overcome this and keep the Internet flowing. The fifth-generation mobile network, 5G, comes to the fore as one of the solutions here. Thanks to this new global wireless connectivity standard, the Internet of Things will find its true place and meaning.

5G, which will be able to serve millions of connections for applications built largely on sensor data, will support economic sectors with new and stronger digital capabilities. Reaching speeds of 10 Gbps, 5G will be roughly 100 times faster than the current 4G standard (whose typical peak is on the order of 100 Mbps). 5G will therefore make it possible to share data extremely quickly, minimize processing delays, and let factory systems react in real time. The reliability of 5G connectivity will provide a stable, consistent network connection on factory floors anywhere and at any time, ensuring that critical business tasks run uninterrupted and unimpeded. Thanks to 5G, large volumes of machine-to-machine data communication become a plausible near future, and artificial intelligence will undoubtedly play an important role in this fluid environment that 5G creates.

Among the important advantages 5G will bring is that cabling needs will be largely eliminated. Combined with 5G's wireless connectivity, mobile autonomous robots will make truly flexible factories possible, with production lines that can be easily reconfigured. In addition, using artificial intelligence for remote image recognition, the machine fleet will be able to monitor the production environment and thus support predictive maintenance. And when 5G and augmented reality come together, higher assembly productivity and quality on the production floor will follow.

In the coming period, the number of smartphones equipped with new capabilities will continue to grow. One innovation will be extending these phones with various sensors, making them usable for ever more kinds of applications. Among these, usage-based contracts and tariffs will emerge: companies will apply differentiated tariffs to customers based on real-time data collected through smartphones. For example, insurance companies will use the sensors and tracking technologies built into smartphones to gather real-time data and better understand their customers' driving habits, giving insurers the opportunity to offer more behavior-based insurance programs.
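To hint at how such behavior-based scoring might work (my own sketch; the samples, threshold, and scoring rule are all invented for illustration), an insurer's app could count harsh-braking events in the phone's accelerometer stream:

```python
# Hypothetical accelerometer samples (m/s^2) along the direction of travel,
# taken at 10 Hz; strongly negative values indicate hard deceleration.
samples = [0.3, -0.5, -1.2, -4.8, -5.6, -2.1, 0.0, 0.4, -0.2, -6.3]

HARSH_BRAKE_THRESHOLD = -4.0  # m/s^2; this cutoff is an assumption

def count_harsh_brakes(accel, threshold):
    """Count samples where deceleration exceeds the harsh-braking cutoff."""
    return sum(1 for a in accel if a < threshold)

events = count_harsh_brakes(samples, HARSH_BRAKE_THRESHOLD)
score = max(0, 100 - 20 * events)  # toy rule: each event costs 20 points
print(f"{events} harsh-braking events; behavior score {score}/100")
```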

Banks will see yet another technological application in the coming period. Artificial intelligence will play an important role in applications whose first examples are already appearing, and banks are predicted to invest seriously in AI in the period ahead. AI applications for banks look like a value-added field for software developers.

The years 2020 and 2021 hold an important place in the history of world technology. Due to the Covid-19 outbreak, there was a great increase in the use of computing devices (including smartphones, tablets, and portable computers) and of the Internet, and interest in social media grew. Beyond ordinary commerce and personal use, education continued in virtual environments, and courier-delivered e-commerce helped sustain turnover in the IT sector.

Another interesting development was brought to the agenda by a statement from Interpol, the international criminal police organization. According to Interpol, the crisis created by the Covid-19 epidemic caused a large increase in cybercrime and hacker attacks. In this context, research conducted by a global management consultancy in 2020 found that the performance of companies working in cyber security is overstated: hackers do not have much trouble breaching cyber defenses.

Cyber security has become an even more important issue in an age when information, both corporate and personal (including the most intimate), moves onto networks and the Internet. The Internet of Things, for example, provides great convenience but also becomes one of the attack targets of cyber hackers. The need to protect information systems and networks, in both hardware and software terms, is therefore growing, and cyber security must become an industrial design criterion for smart, connected objects. At the enterprise scale, identifying common information and communication technology (ICT) security weaknesses and improving cyber security maturity is indispensable for organizations that want to truly digitize.

You need a username and password to log into an information system; these serve to verify that you are an authorized user. Information about such verification rights is held within a particular structure of the information system, commonly called the directory infrastructure (for example, Active Directory). When cyber hackers set out to attack a system, this infrastructure, which holds authentication rights and access authorizations, is among their targets. Hackers who gain access to it can reach many points, from senior executives' smartphones to the company's databases containing private information. The damage is not only at the information level: hackers can also damage computing equipment. Seen from the home perspective, your household appliances, say, a smart, connected boiler or refrigerator, are also at risk.
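As a small aside on how that username-and-password check is typically implemented (a generic sketch, not tied to Active Directory or any specific product): systems store a salted hash of the password rather than the password itself, then recompute and compare at login:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = b""):
    """Derive a salted hash; only (salt, hash) is stored, never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify(password: str, salt: bytes, stored: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    _, candidate = hash_password(password, salt)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify("correct horse battery staple", salt, stored))  # True
print(verify("wrong guess", salt, stored))                   # False
```

If an attacker copies a directory's stored hashes, salting and a slow derivation function like PBKDF2 make mass password recovery far more expensive.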

The cyber security field continues to be a growing business sector alongside information and communication technologies. The virtual attacks observed so far demonstrate the seriousness of the issue. It is therefore inevitable that companies will invest in monitoring cyber attacks in real time and in taking security measures for themselves and their customers.

The most important agenda item that the Covid-19 epidemic put before the world was health, and with it the value of health data emerged once again. The rapid collection of health data offers the industry unprecedented opportunities to apply groundbreaking digital capabilities such as artificial intelligence to improve treatment. Intelligent use of health data has the potential to significantly improve patient treatment and care: with such data it becomes possible to diagnose diseases correctly, apply treatment quickly, shorten time spent in health institutions, and minimize drug use.

The rise of health-sector startups in technoparks also reveals interesting dimensions of the issue, and enterprises developing health hardware and software are showing growing interest in artificial intelligence.

Despite such technological advances, we should not neglect the importance of the workforce. Global surveys predict that by 2025 Generation Y will make up three-quarters of the global workforce. Businesses can raise the performance of their human resources (HR) by using technology to build an innovative recruiting process.

Some startups already offer next-generation recruiting solutions using AI software. Drawing on expertise in cognitive science and gamification, these startups create mini-games for smartphones, tablets, and computers, and develop applications that smoothly evaluate technical expertise, intellectual and social skills, cultural fit, and many other HR factors.

These are not the only directions technology is taking; I will try to examine others in my next articles.

