📉

How Colleges Fail Computer Science Students

Jan. 7th, 2022

Cybersecurity is an emerging field with huge potential for any university graduate or computer science autodidact, driven by companies' increased reliance on computer systems and the worldwide growth of smart devices. According to Cybercrime Magazine, there are 3.5 million cybersecurity job openings globally, of which 500,000 come from tech needs in the United States alone. This need will not be going away soon, either. The Bureau of Labor Statistics predicts a 28% increase in demand for information security analysts between 2016 and 2026, an increase of 28,500 jobs. Now let's explore why colleges are not preparing undergraduates for these amazing opportunities.

How Did We Get Here?

The technological age has revolutionized society, and the possibilities for what the world can build now seem endless. However, those same possibilities translate into an ever-growing diversity of cyber threats.

1970s - The First Computer Worm

Researcher Bob Thomas created Creeper, a program that could move across the ARPANET, leaving a trail as it went. Creeper was the first self-replicating program, making it the first-ever ‘computer worm’. Ray Tomlinson, the inventor of email, wrote Reaper, which chased and deleted Creeper, making it the very first example of antivirus software (2).

1980s - Birth of the Commercial Antivirus

In 1987, Andreas Lüning and Kai Figge released their first antivirus product for the Atari ST, the same platform that saw the release of Ultimate Virus Killer that year (2).

1990s - The World Is Now Online

The world soon moved online. Organized crime entities saw this digitalization as a potential source of revenue and began stealing data from individuals and governments via the web. By the mid-1990s, network security threats had increased exponentially, and firewalls and antivirus programs had to be mass-produced to protect the public (2).

2000s - Threats Increase

Cyber attacks became more prevalent. As governments started to crack down on these crimes, malicious attackers exploited whatever openings remained. Demand for information security grew, and professionals had to step up to the new challenge.

Present

Cyber attacks are growing rapidly, and every day more companies put their operations under the control of technology. This creates huge security risks if not properly managed.

Skills Employers Require

  • Basic understanding of application architecture (frontend, backend, database), plus administration and management of systems, networking, and virtualization software
  • General programming and software development concepts and software analytics skills
  • Programming languages - Java, C/C++, and scripting languages (PHP, Python, Perl, or Shell); see the sketch after this list
  • Bachelor's degree in Computer Science, Information Systems, or a related field
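
To make the scripting-language bullet concrete, here is a minimal sketch in Python of the kind of everyday security scripting these roles involve: checking a host for a few open TCP ports. The hostname and port list are illustrative assumptions, not requirements pulled from any particular job posting.

    # Minimal port-check sketch (illustrative; host and port list are assumptions).
    # Only scan hosts you own or are explicitly authorized to test.
    import socket

    COMMON_PORTS = {22: "ssh", 80: "http", 443: "https", 3306: "mysql"}

    def scan_host(host, timeout=0.5):
        """Return a dict mapping each port to True if it accepts a TCP connection."""
        results = {}
        for port in COMMON_PORTS:
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
                sock.settimeout(timeout)
                # connect_ex returns an error code (0 on success) instead of raising
                results[port] = sock.connect_ex((host, port)) == 0
        return results

    if __name__ == "__main__":
        for port, is_open in scan_host("127.0.0.1").items():
            label = "open" if is_open else "closed"
            print(f"{port}/tcp ({COMMON_PORTS[port]}): {label}")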

Generally, employers are unwilling to bring in people who want to learn and to train them on the job. This creates a problem because many skilled and experienced workers choose more interesting emerging technologies such as AI, machine learning, and data science instead.

⁉️ How We Do It

SecureState uses a combination of automated tools and manual testing to provide a hybrid approach that includes both proactive and reactive security testing. Our team has decades of cybersecurity experience at some of the largest tech companies, including AWS, VMware, Google, and Nintendo.

Take the first step to security and schedule a call today!

📆 https://calendly.com/securestate/lets-talk-security