covid-19 – Dataconomy

Largest COVID vaccine study identifies small risks

In a landmark development, the largest COVID vaccine study ever, encompassing data from 99 million people across eight countries, has identified some potential risks associated with the shot. Researchers from the Global Vaccine Data Network (GVDN) analyzed the data to track any increases in 13 different medical conditions in the period after individuals received a COVID vaccine, reports Fox News.

Largest COVID vaccine study ever reveals some concerns

The largest COVID vaccine study, published last week in the journal Vaccine, revealed a slight uptick in neurological, blood, and heart-related medical conditions following vaccination, according to a GVDN press release. Notably, individuals who received certain mRNA vaccines exhibited a higher risk of myocarditis, an inflammatory condition affecting the heart muscle. Some viral-vector vaccines were linked to an increased likelihood of blood clots in the brain and Guillain-Barré syndrome, a neurological disorder.

The press release further noted potential risks such as inflammation of the spinal cord after viral vector vaccines, and inflammation and swelling of the brain and spinal cord after both viral vector and mRNA vaccines.

“The size of the population in this study increased the possibility of identifying rare potential vaccine safety signals,” emphasized lead author Kristýna Faksová, Department of Epidemiology Research, Statens Serum Institut, Copenhagen, Denmark. “Single sites or regions are unlikely to have a large enough population to detect very rare signals.”

The largest COVID vaccine study has identified some potential risks associated with the shot (Image credit)

Independent experts weigh in on the findings

Dr. Marc Siegel, a clinical professor of medicine at NYU Langone Medical Center and a Fox News medical contributor, while not directly involved in the research, commented on the results. He acknowledged the study’s findings reveal “some rare association of the mRNA vaccines and myocarditis, especially after the second shot, as well as an association between the Oxford-AstraZeneca adenovirus vector vaccines and Guillain-Barré syndrome.”

Crucially, Dr. Siegel stressed that “these risks are rare,” citing other studies demonstrating that the vaccines significantly reduce the risk of myocarditis compared to contracting COVID itself. He emphasized that all vaccines have side effects, which makes a risk-benefit analysis necessary for each individual, weighing the potential consequences of vaccine side effects against those of contracting the virus itself, including long-term complications like brain fog, fatigue, cough, and heart issues.


Dr. Jacob Glanville, CEO of Centivax, a San Francisco biotechnology company, echoed Dr. Siegel’s sentiment. He stated that the largest COVID vaccine study “is confirming in a much larger cohort what has been previously identified in the original studies during the pandemic,” referring to the rare occurrence of myocarditis and pericarditis with mRNA vaccines and blood clots with viral-vectored vaccines. He reiterated that “the odds of all of these adverse events are still much, much higher when infected with SARS-CoV-2 (COVID-19), so getting vaccinated is still by far the safer choice.”

The largest COVID vaccine study ever encompasses data from 99 million people across eight countries (Image credit)

Significance of the research and broader context

This largest COVID vaccine study forms part of the Global COVID Vaccine Safety (GCoVS) Project, a wider research initiative supported by the Centers for Disease Control and Prevention (CDC) under the U.S. Department of Health and Human Services (HHS). It’s worth noting that over 80% of the U.S. population has received at least one dose of the COVID vaccine, as per the CDC.

While the study identifies some potential risks, it reinforces the crucial message of weighing the risks and benefits of vaccination in consultation with healthcare professionals. The findings don’t necessitate a radical shift in perspective, but rather, provide further evidence for informed decision-making regarding individual health and protection against a potentially severe and long-term illness.

Remember: While some rare risks exist, the overwhelming evidence indicates that the benefits of COVID vaccination far outweigh the potential downsides, especially considering the significant risks associated with contracting the virus itself. For those with concerns, consulting a healthcare professional remains the most effective way to make informed and personalized decisions regarding their health.

Featured image credit: Daniel Schludi/Unsplash

Contactless payment usage has significantly increased in past three years

  • As per Lloyds Bank data, 65% of face-to-face payments were made using contactless debit cards in June 2019, before the pandemic; by June 2022, this had increased to 87%.
  • According to figures released in March by the banking sector trade association UK Finance, about £166 billion was spent in the UK using contactless technology in 2021, compared with £80.5 billion in 2019.
  • According to Lloyds, 95% of restaurant bills in the UK are paid via contactless technology, which includes mobile wallets, and 83% of purchases at supermarkets are contactless.

According to data from Lloyds Bank, Covid-19 significantly changed how face-to-face payments are made, with nearly 90% of transactions now being contactless.

Contactless payment satisfied the need to socially distance

In June 2019, before the pandemic, 65% of face-to-face transactions were made using contactless debit cards, according to data from the UK bank; by June 2022, that number had risen to 87%.

During the Covid-19 pandemic, there was a drastic increase in the usage of contactless payment with debit cards (Image credit)

By June 2020, contactless debit cards were used for 72% of face-to-face transactions, and by June 2021 that figure had climbed to 83%, according to the bank.

Contactless cards were first introduced in the United Kingdom in 2007, with a £10 spending limit at the time. The cap had been raised to £30 by 2020 but saw major increases during the pandemic: it rose to £45 in April 2020 and now stands at £100.

Gabby Collins, payments director at Lloyds Bank, said: “The convenience of a contactless payment is clear when you look at the growth in this type of payment over time, with 87% of face-to-face debit card transactions now made using the technology.”

The fact that most people already used cards made it easy to adapt to contactless cards (Image credit)

Customers can set their own spending limit, up to £100, using Lloyds’ mobile app. Since its introduction in 2021, this feature has been used by about 800,000 of the bank’s customers.

The Covid-19 pandemic accelerated the adoption of contactless technology. When the pandemic hit, people were advised to limit physical contact, including the handling of cash. Because most people already used payment cards, unlike mobile payment apps, contactless technology was a natural substitute for cash. This prompted groups such as the elderly, who are notoriously slow to adopt new technology, to embrace it.

Payments via contactless technology more than doubled in the UK between 2019 and 2021 (Image credit)

As per figures released in March by the banking sector trade association UK Finance, about £166 billion was spent in the UK using contactless technology in 2021, compared with £80.5 billion in 2019.

According to the latest UK Payment Markets 2022 research, the pandemic had a revolutionary influence on the payments industry, accelerating the long-running decline in cash payments and interrupting years of growth in debit card usage.


“It also led to changes in the payment types used. People made greater use of contactless payments, online banking, and mobile wallet channels, largely at the expense of cash payments,” said the report summary document.

According to Lloyds, 95% of restaurant bills in the UK are paid via contactless technology, which includes mobile wallets, and 83% of purchases at supermarkets are contactless.

Using AI in a heavily regulated environment: drug and vaccine inspection

In August 2021, news that Japan had suspended 1.63 million doses of the Moderna vaccine shocked the world community. It was revealed that the vaccine was contaminated with metal particles due to “human error specific to visually misjudging the required 1mm gap between the star-wheel and the stopper” of the machine that put the tops on vials. This incident put the spotlight on vaccine inspection and how it could be carried out at scale.

Clearly, new systems were needed: this mistake could have posed a threat to the lives of the 1.63 million people receiving those doses.

New inspection methods needed

Pharmaceutical products are getting more complex, meaning that inspecting them is getting more complicated as well, reaching the limits of what is possible with traditional algorithms or manual visual inspection. The pandemic, in particular, has made the whole global population dependent on the quality of COVID-19 vaccines. Biotech products are also more sensitive than other chemicals, which results in more sophisticated packaging.

As data collection techniques become more efficient and widely available, many pharmaceutical companies are starting to consider Artificial Intelligence (AI) for improving their drug and vaccine inspections. Machine Learning (ML), a subset of AI, takes advantage of this improved data collection to create algorithms that can detect defects in real time with human-level accuracy. For this purpose, ML engineers use supervised and unsupervised algorithms: unsupervised algorithms detect very rare events such as mixed-up products, whereas supervised ones identify small objects like foreign-body particles, pieces of glass or metal, hairs, and so on.
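
To make the supervised/unsupervised split concrete, here is a minimal sketch in Python; it is not the production systems the article alludes to, and the data, dimensions, and thresholds are all invented for illustration. An isolation forest flags rare anomalies it was never shown, while a supervised classifier recognizes known defect classes.

```python
# Minimal sketch, not a production inspection system.
# Feature vectors stand in for real vial images; all numbers are invented.
import numpy as np
from sklearn.ensemble import IsolationForest, RandomForestClassifier

rng = np.random.default_rng(0)

# 1,000 "normal" vials plus a handful of oddities (e.g. mixed-up products),
# each reduced to a hypothetical 64-dimensional feature vector.
normal = rng.normal(0.0, 1.0, size=(1000, 64))
oddities = rng.normal(4.0, 1.0, size=(5, 64))

# Unsupervised: flag very rare events the model was never shown.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)
print("unsupervised flags:", detector.predict(oddities))   # -1 = anomalous

# Supervised: recognize known defect classes (glass, metal, hair, ...).
defects = rng.normal(2.0, 1.0, size=(200, 64))
X = np.vstack([normal[:200], defects])
y = np.array([0] * 200 + [1] * 200)        # 0 = clean, 1 = foreign particle
clf = RandomForestClassifier(random_state=0).fit(X, y)
print("supervised training accuracy:", clf.score(X, y))
```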

Current inspection approaches: why AI is better

Currently, pharmaceutical companies inspect drugs and vaccines using two approaches: manual visual inspection by human operators and classic Computer Vision (CV) algorithms. AI algorithms should be considered instead of manual inspection because they eliminate the chance of human error, providing more consistent and accurate results. At the same time, because these algorithms reduce the number of workers involved in inspection, the remaining workers can focus on other tasks, which also reduces labor costs for the company.

In comparison to classic algorithms, ML algorithms represent a more advanced inspection technology. They are not programmed beforehand to perform a task but instead learn it themselves by analyzing their own performance and learning from mistakes. Once an algorithm achieves the best metrics possible, ML engineers can freeze it to lock in the quality goals.
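
What “freezing” might look like in practice, as a hedged sketch: once a model hits its target metric, gradient updates are disabled and a fixed artifact is exported. This assumes a PyTorch-style workflow; the model, metric values, and file name are placeholders, not details from the article.

```python
# Hedged sketch of "freezing" a model once it reaches its quality goal:
# disable gradient updates and export a fixed artifact.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(64, 32), nn.ReLU(), nn.Linear(32, 2))

validation_accuracy = 0.97   # hypothetical result from a held-out set
TARGET = 0.95                # hypothetical quality goal

if validation_accuracy >= TARGET:
    for param in model.parameters():
        param.requires_grad = False       # no further learning
    model.eval()                          # fix dropout/batch-norm behaviour
    torch.save(model.state_dict(), "inspection_model_v1.pt")
```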

In addition, classic algorithms are often not feasible in an environment of constant change. They employ a fixed, explicitly programmed mathematical approach, whereas ML algorithms can be applied to any situation where data is available, making them better suited to visual inspection. AI algorithms process data in raw form, which, in the case of drug inspection, means images.
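
The contrast can be sketched as follows; this is illustrative only, with a synthetic image and invented thresholds. The classic rule is programmed in advance and breaks when lighting or packaging drifts, while the ML model consumes raw pixels and can simply be retrained.

```python
# Illustrative contrast: a hand-tuned classic CV rule vs. an ML model
# fed raw pixels. The image is synthetic and every number is invented.
import numpy as np
import cv2
from sklearn.linear_model import LogisticRegression

# A synthetic 64x64 grayscale "vial" image with one bright speck (the defect).
img = np.zeros((64, 64), dtype=np.uint8)
cv2.circle(img, (40, 40), 3, 255, -1)

# Classic approach: a fixed brightness threshold programmed in advance.
_, mask = cv2.threshold(img, 127, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
print("classic rule found", len(contours), "candidate defect(s)")

# ML approach: no hand-written rule; a model learns from labelled raw
# pixels and can be retrained whenever the environment changes.
clean = np.zeros((64, 64), dtype=np.uint8)
X = np.stack([img.ravel(), clean.ravel()]).astype(float)
y = np.array([1, 0])                       # 1 = defect present, 0 = clean
clf = LogisticRegression().fit(X, y)
print("ML prediction for the defect image:", clf.predict([img.ravel()])[0])
```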

Corporate innovation: easier said than done

Interestingly, the number of companies actually implementing AI in inspections is drastically smaller than the number willing to apply it. This is often explained by the absence of cross-functional teams in the corporate workplace. Creating a well-functioning AI process requires a team whose members possess knowledge of IT development and machine learning as well as quality inspection and pharmaceutical regulations. This frequently poses a problem in a corporate environment where every individual or department has a narrow set of skills.

Consequently, the solution cannot be implemented without consulting the whole corporation. Normally, the company needs to build an entire innovation department from scratch with people of the relevant skills, which often takes three to five years. The team also requires material to work with, so companies must invest in special hardware such as robots or robotic arms, cameras, servers, and screens for the operator interface.

This is why tech companies providing AI for drug inspection are emerging. The teams of small companies are much more multifunctional simply because their size forces them to be. Despite the bureaucratic hurdles, these tech innovators set up pilot projects with pharma giants to demonstrate how easy it is to implement AI. This often simplifies change management at the corporation, which can co-work with a small company that already has a solution and can help introduce it in the workplace. As a result of such cooperation, pharma manufacturers see fewer false rejects, deviations, and recalls.

Challenges on hand

Currently, these tech companies face two main challenges. The first is improving the algorithms to approach 100% accuracy: visual inspections with AI have been shown to reach up to 90% precision, leaving room for improvement. The second is overcoming the barriers to entry in the pharmaceutical environment: all new technologies must comply with FDA and EudraLex regulations before they can be used.

EudraLex Volume 4, Annex 11 (“Computerised Systems”) provides a perfect example. According to this directive, the manufacturer of a medical device to be used in the pharmaceutical industry must guarantee proper validation, electronic archiving and signatures, risk management, and security. This regulation alone requires extra effort from AI developers, who must also ensure a high level of security given the confidentiality of corporate data and regularly validate the computer system. And because AI algorithms are essentially a “black box,” validating and controlling the process steps can sometimes be challenging.

Postponing change has fatal consequences

The pandemic has clearly demonstrated the necessity for a balance between having strict regulations and proper change management in pharmaceutical companies. Even with all the pressure and time constraints, it took more than 1.5 years for the most advanced enterprises in the field to actually start the mass production of COVID-19 vaccines. This is clear evidence of pharma manufacturers lacking flexibility. 

Innovative approaches to drug manufacturing, such as AI, have previously been neglected. Most pharmaceutical companies put “extending human lives” and “improving the quality of global healthcare” in their mission statements. This, of course, explains why there are so many standards and procedures in place: pharmaceutical companies cannot risk patients’ lives. However, in a state of emergency, when the fate of the whole global population is at stake, it is the duty of these companies to deliver results as soon as possible, at any cost, especially when innovative solutions are there for them to use. Refraining from doing so simply contradicts their corporate values.

Finding the right AI solution

And now comes the main question: how does a pharma business find the best AI provider? Visual inspection with AI is very new, which is exactly why it is mostly startups developing the relevant algorithms. It is therefore recommended to look for startups with references from pharmaceutical companies. Young companies in this area usually carry no legacy of other products, which means they use their tech solution at full capacity: working on and improving exactly the few most important features for drug inspection instead of chasing other use cases.

How artificial intelligence can fight Long COVID

Clinicians pivoted their AI efforts to engage in the battle against the COVID-19 pandemic. How can it help those with persistent symptoms – the so-called “long COVID”?

The emergence of COVID-19 as a pandemic in March 2020 had a profound effect on the demand for health services that continues to this day. The impact on laboratory services was particularly notable in the early days of the pandemic. A study helmed by Thomas J. S. Durant of the Department of Laboratory Medicine at the Yale University School of Medicine found that from late February to mid-April 2020, more than 870,000 COVID-19 tests were administered in the U.S., yet overall lab testing went down significantly. COVID had quickly become a burden that was affecting other laboratory functions.

But some hospital laboratories had an ace up their sleeves: Artificial intelligence (AI). Some developed algorithms to predict the likelihood of a patient contracting COVID based on demographic data and vaccination history to prioritize then-limited testing resources. Some adapted existing projects to predict respiratory failure.

Some retrained radiological-imaging AI projects to speed the diagnosis of thoracic ailments. Others used AI and machine learning (ML) modeling to identify which patients needed less attentive care and which were likely to require intubation (the procedure used when a patient cannot breathe unaided). Massachusetts Institute of Technology (MIT) researchers also developed an AI model that could distinguish even asymptomatic COVID-19 sufferers with startling accuracy by analyzing recordings of coughs collected over mobile phones. The model was adapted from algorithms already proven to detect asthma and pneumonia accurately.

In short, AI and ML have eased healthcare burdens by de-escalating patients to free up resources, prioritizing testing for those most likely to have been exposed, and providing diagnostics with novel forms of data capture. While this benefits the predictive and early treatment phases of the COVID protocol, AI must also play a role in post-acute cases.

While most people recover from a COVID infection within a few weeks, some have symptoms that linger. Long COVID – or post-acute COVID, or chronic COVID – is defined as symptoms lingering for 12 weeks or more after infection that can’t be explained by another existing or recently acquired condition. The list of symptoms is long: trouble breathing, headaches, fever, “brain fog,” heart palpitations, joint or muscle pain, and changes in smell or taste are just a few. Long COVID can result from the initial infection’s damage to the lungs, the heart, kidneys, skin, and brain – practically any organ in the human body. And Long COVID isn’t just a problem for those who were severely ill or hospitalized because of COVID; it can manifest in COVID patients who were asymptomatic during the acute phase of the infection.

We can leverage these front-line AI and ML technologies to help manage Long COVID suffering. Predictive models could ascertain the likelihood that a patient will suffer from post-acute COVID, determine which patients are suffering from non-infectious consequences of COVID, like isolation and loss of income, and comb through the chemical composition of an entire library of pharmaceutical treatments that may be effective at treating post-COVID symptoms.

We have an enormous body of data to train artificial intelligence and machine learning platforms. But there is an impediment to its success: siloing.

When COVID-19 was first declared a pandemic, health care providers had to scramble to adopt or adapt AI tools to address their specific challenges. This meant that modeling and algorithm development took place in-house rather than in a shared forum, leaving no guarantee of consistent data collection or output formats that could be shared with other providers, even within the same region, where shared results would be most beneficial.

Interoperation of disparate systems is a vital component of a comprehensive approach to predicting, detecting, and treating COVID-19 and its post-acute complications. This requires standardization of reporting formats at the healthcare-authority or statewide level, ideally compliant with Centers for Disease Control and Prevention (CDC) protocols. Data collection procedures must be formalized, digitized in a form that is resistant to error, and brought as close to the patient as possible for maximum accuracy.
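
As a purely illustrative sketch of what a standardized, digitized reporting record might look like, a shared schema with built-in validation keeps every site’s output consumable by the same downstream models. The field names here are hypothetical, not a CDC or HL7/FHIR specification.

```python
# Illustrative only: a shared, validated reporting schema so that every
# site's output feeds the same downstream models. Field names are
# hypothetical, not a CDC or HL7/FHIR specification.
from dataclasses import dataclass, asdict
from datetime import date
import json

VALID_RESULTS = {"positive", "negative", "inconclusive"}

@dataclass
class CovidTestRecord:
    patient_id: str         # pseudonymous identifier
    collection_date: date
    result: str             # constrained to VALID_RESULTS
    assay: str              # e.g. "RT-PCR"
    reporting_site: str

    def validate(self) -> None:
        # Digitized, error-resistant collection: reject bad values at entry.
        if self.result not in VALID_RESULTS:
            raise ValueError(f"unexpected result value: {self.result!r}")

record = CovidTestRecord("anon-0001", date(2021, 11, 3), "positive",
                         "RT-PCR", "site-CT-12")
record.validate()
print(json.dumps(asdict(record), default=str))
```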

AI and ML technologies offer nearly boundless opportunities in the life sciences field—modeling potential disease hot spots, discerning the desperately ill who require attentive treatment from those with a mild illness that can be self-managed, and predicting what resources will be needed and where to battle the pandemic. We have a head start on fighting Long COVID and the related symptoms by preparing for interoperability.

How Business Intelligence Creates Collaboration in the Workforce

Collaboration is a key driver of success in business and one of the hardest things to achieve in today’s work climate. Since the pandemic’s onset, businesses have been juggling remote workers, hybrid workers, and everything in between. Business intelligence can be crucial in creating collaboration.

According to the U.S. Census Bureau and the Bureau of Labor Statistics, remote working grew by 159% between 2005 and 2017; despite this, FlexJobs noted that only 3.4% of the workforce worked from home. Of course, this was all before COVID-19 hit businesses hard, forcing most non-essential workers to work remotely. And perhaps to everyone’s advantage, this trend doesn’t seem to be going away anytime soon. A recent survey found that 82% of respondents enjoy working from home, and 66% felt they were more productive working remotely. The future of work has changed dramatically this year, forcing businesses to put various solutions in place to support the changes that COVID-19 brought about.

According to Zapier CEO Wade Foster, who has hundreds of employees working remotely, “companies who don’t have effective systems in place are winging it in a lot of areas right now. They’re going to have a hard time with this sudden transition. They are being thrust into an environment where they have no structure.” He told Computerworld that the “wrong type of management, misaligned culture, and lack of essential tools” could create a negative remote working environment.

Business intelligence can be the catalyst

One of the “essential tools” for creating a successful, collaborative work environment is an automated, centralized, cloud-based Business Intelligence platform used across the organization’s BI landscape. Today, businesses depend on data to make important decisions. Gone are the days when only business analysts accessed data; today, the entire organization, or anyone who needs to make a decision, whether big or small, needs access to the same data to collaborate with team members immediately. The data exists in silos and needs one centralized solution that provides visibility across the entire landscape. A cloud-based, centralized Business Intelligence platform provides the internal framework where everyone has access to the same semantically defined, usable, and trustworthy data right off the bat.

This has been particularly useful when collaboration is required to solve a business problem for a large organization. In the past, so much time was wasted, and projects were stalled, by the “black hole” of manually discovering, organizing, preparing, and cleaning data for analysis; sometimes this manual work would take months. The automation of metadata lets people get on with collaboration by giving everyone access to the same data across the entire company. A person working in California has the same data as a remote worker in Australia, and if they are on the same team, they are looking at the same information and can collaborate on how to proceed to the next step.

It is now more crucial than ever to have a centralized, automated business intelligence platform. As workers become more scattered across the world, data is becoming more centralized, creating a collaborative environment within the remote workspace. The ability to trace the origin of data with a view of its entire lineage, and to find data immediately through data discovery, significantly speeds up decision-making by providing the relevant information right away, creating the internal framework of a collaborative workforce.
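
As a toy illustration of the lineage idea (the dataset names are invented, and real platforms do far more), tracing a dashboard back to its sources is just a walk over recorded parent links:

```python
# Toy illustration of data lineage: record each dataset's parents once,
# then anyone can trace a dashboard back to its sources. Names invented.
lineage = {
    "sales_dashboard": ["sales_mart"],
    "sales_mart": ["crm_extract", "erp_extract"],
    "crm_extract": [],
    "erp_extract": [],
}

def trace(dataset: str) -> list[str]:
    """Return every upstream dataset feeding `dataset`, nearest first."""
    upstream = []
    for parent in lineage.get(dataset, []):
        upstream.append(parent)
        upstream.extend(trace(parent))
    return upstream

print(trace("sales_dashboard"))  # ['sales_mart', 'crm_extract', 'erp_extract']
```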

Who knew that COVID-19 would change the way businesses operate forever, making us more remote yet more cohesive and collaborative as the nuts and bolts of the company are forced to centralize? Companies worldwide are unifying through centralized data, giving everyone the right information no matter where they are.

Cyber-attacks increase threefold, yet there are 4m unfilled cybersecurity positions

In 2020, the world experienced an unprecedented increase in cybercrime amid COVID-19. In fact, data breaches increased 273 percent in the first quarter compared to the same period in 2019, according to a new study from cloud computing company Iomart.

Thanks to the additional vulnerabilities that opened up as people worked from home, the push to take everything digital and conduct all business online, and the general confusion caused by the pandemic, cybercriminals have taken full advantage of the situation.

That’s a significant problem on its own, but there’s another issue at hand that makes the situation even worse.

According to a report by (ISC)2, the number of unfilled cybersecurity positions now stands at 4.07 million, up from 2.93 million this time last year. This includes 561,000 in North America.

The shortage of skilled workers in the industry in Europe has soared by more than 100 percent over the same period, from 142,000 to 291,000.

The report suggests a number of remedies for this situation, including in-house training, bringing employees across from other IT areas and retraining them, and hiring for aptitude so that new staff can be brought up to speed on cybersecurity quickly.

One company has been helping to plug this gap. Cybint – a global cyber education company – recently partnered with LCC International University, an American-style university with students from over 50 countries, to create the Cybint Bootcamp.

Cybint also recently partnered with Israel-based web data provider Webhose and threat protection platform IntSights to provide a more well-rounded learning experience for Cybint users. These partnerships are part of the company’s effort to join forces with leading cyber technologies, bolstering the tools at its disposal to further reskill the workforce and upskill the cybersecurity industry.

And that’s important, because the shortfall of talent in the cybersecurity industry, combined with the rapid growth in attacks and breaches, is going to need to be dealt with quickly.

“We like to compare the cybersecurity market to that of coding and computer programming a few decades ago,” Roy Zur, CEO and founder at Cybint, told me. “Many of the first pioneers in this field were self-taught or learned by doing, mainly because traditional higher education just hadn’t caught up yet and employers were looking for the skills. Fast-forward, there are coding bootcamps and academies dedicated to this field as an alternative to degree education. Cybersecurity is similar in the way that the demand exists, but the skilled individuals aren’t necessarily coming out of higher education, and if they are, their skills are not always practical or relevant to real life. We believe that there is a huge opportunity for cyber professionals to learn skills quickly and effectively through intensive career bootcamps that are focused on the most in-demand job roles in cybersecurity.”

Security firm McAfee estimates the cost of cybercrime in 2020 reached $1 trillion, a figure that includes both the losses incurred and the amount of money spent on cybersecurity. If businesses are going to get a handle on these costs – which represent a 50 percent increase on 2018 – they are going to have to move fast.

So how long does someone have to train in cyber to become effective and gain employment in the field?

“Traditionally, it’s a matter of going through college and certification,” Zur said. “Alternatively, it could be as quick as three months in the full-time Cybint Cybersecurity Bootcamp. My extensive background in cybersecurity military training mixed with my CPO’s background in building career boot camps at MIT has allowed us to put together a learning experience that is incomparable to what’s currently available. It’s practical, highly-focused, and interactive – exactly the experience that employers are looking for in their candidates.”

That focus on getting students from starting the course to being employable is important to Cybint, and crucial for businesses everywhere.

“We are truly career-focused,” Zur said. “Our end goal is to help our Bootcampers land high-paying and long-term opportunities in the market. We’ve tailored the Cybint Bootcamp and our business model to achieve this to ultimately close the workforce and skills shortage in cybersecurity.”

So what’s next for Cybint?

“There are quite a few avenues we can take as we scale,” Zur said. “However, we plan to stay true to our mission of tackling the workforce shortage and skills gap through skills learning and collaboration. With that said, we plan to offer the Cybint Bootcamp in more locations worldwide through our partners and expand the cybersecurity roles we train for.”

One thing is certain. With such a huge increase in cybersecurity attacks, and the huge skills gap we’re currently experiencing, 2021 is already set to cost organizations as much as 2020 did. Those willing to move across to cybersecurity can see this as an opportunity: the cyber market is forecast to grow to $248.26 billion by 2023, making it a lucrative area, and one that may rival other highly paid IT roles, such as data science, analysis, and engineering.

This article originally appeared at Grit Daily, and is reproduced with permission.
