Careers in Cyber

Does this sound familiar?  You keep seeing headlines about cyber security, about information security, usually when there’s been a loss of passwords or data, sometimes about large fines being levied on companies for poor practice. You’ve heard that there are lots of vacancies in the world of cyber and would like to look at a career in security. But you don’t know what choices there are, you don’t have good IT skills and you don’t know what skills you need.

This article will answer some (though probably not all) of your questions.

Before looking at what roles there are, let’s get the first big concern out of the way, shall we? Do you need to be an IT ninja to work in information security? The answer is a resounding NO (though for some – not all – roles it helps). Read on to find out why…

Broadly speaking, cyber security is split into three main role groups:

  • governance, risk and compliance (GRC), which relates to policies, processes, and, in some cases, training. These roles include consultants, analysts, auditors and trainers
  • offensive security, also known as red teaming, with the aim of trying to get unauthorised access to systems. Roles in this group include ethical hackers (penetration testers), social engineers etc
  • defensive security, also known as blue teaming, with the aim of trying to stop those trying to get unauthorised access to systems. Roles in this group include digital forensics specialists, incident responders, Security Operations Centre (SOC) analysts etc

GRC roles

These roles typically require little to no technical skill, though an understanding of technology helps.

People in these roles will probably spend their time writing and reviewing policies and other documentation, carrying out audits to ensure the organisation is complying with policies and / or industry standards, and working with other staff to help them understand and implement the policies. At a more senior level, these roles also encompass consultancy: working with clients to help them understand and improve their security posture.

It’s likely that people in GRC roles will spend time looking at industry standards such as ISO 27001 and the NIST frameworks, regulations such as GDPR, and industry-specific requirements such as PCI DSS.

In terms of training, people in this group will be more likely to develop and perhaps deliver general security training rather than specific courses for highly technical staff.

In terms of qualifications, a good basis would be the BCS Certificate in Information Security Management Principles (CISMP), and if you’d like to add some technical knowledge, passing the CompTIA Net+ and Sec+ exams would be a really good grounding. Courses around data privacy are becoming more common too. Ultimately you’d be aiming for something like the ISACA Certified Information Security Manager (CISM), (ISC)² Certified Information Systems Security Professional (CISSP) or EC-Council Certified Chief Information Security Officer (C|CISO) qualifications, but these require at least five years of practical experience as well as an exam pass.

Red Team (Offensive Security)

This is where many people think the really exciting part of security sits: being paid to test other companies’ defences and to help them improve their security. This is the realm of the ethical hacker, more properly called a penetration (pen) tester.

Pen testers are, by necessity, quite technical. Typically they’ll be able to write scripts and code in several different languages, including Bash and Python. They’ll understand toolsets such as Metasploit, which is available for free on Kali Linux. (Incidentally, the bad guys will use pretty much the same toolsets for much of their work, and both groups will probably learn a lot about how to use them from YouTube!) They’ll also be able to write exploits, perhaps for use in Metasploit or elsewhere. Oh, and they’d better understand network protocols and how firewalls work too. Essentially, they need to know a lot about a lot of things in order to be very proficient, though it is possible to run many of these tools with very little knowledge.
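To give a small taste of the kind of everyday scripting involved, here is a minimal Python sketch that checks whether a handful of common ports are open on a host. The hostname and port list are purely hypothetical, the script isn’t taken from any particular toolset, and of course you should only ever point something like this at systems you’re authorised to test.

```python
import socket

# Illustrative only: always get written authorisation before probing a host.
HOST = "scanme.example.org"          # hypothetical target
COMMON_PORTS = [22, 80, 443, 3389]   # SSH, HTTP, HTTPS, RDP

def port_is_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out or otherwise unreachable
        return False

if __name__ == "__main__":
    for port in COMMON_PORTS:
        state = "open" if port_is_open(HOST, port) else "closed / filtered"
        print(f"{HOST}:{port} is {state}")
```

Real engagements use far more capable tools (Nmap, Metasploit and so on), but most of them boil down to the same idea of probing systems, recording responses and interpreting the results.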

There is a form of red teaming where people try to physically get access to premises and systems using social engineering techniques.  This typically involves carrying out research on the target company using OSINT techniques, before creating some kind of pretext (cover story) or getting in through open doors and windows.  The goal may be to try to access a data centre or other sensitive room in a building, or it may be to leave some kind of listening / communications device in a meeting room, or to see what documentation can be obtained. This is the sort of work that you may have seen in films like Sneakers, where teams of people are testing an organisation’s security capabilities. Skills needed for this type of role are more related to acting / improv, calmness under pressure and the ability to think quickly.  A good understanding of human psychology, empathy, body language and non-verbal communication is really helpful in this field.

Training for the red team can be very technical, or not technical at all. If technical, you probably need to look at something like CompTIA Net+ and Sec+ as a basic grounding, before moving on to the Offensive Security Certified Professional (OSCP) or CHECK Team Member (if in the UK). It’s worth saying that when it comes to the technical aspects, lots of practice with different packages, scripting languages and exploits is probably more beneficial than lots of certifications, though having at least one industry-respected certification will be helpful.

It’s also worth noting that many red team members will have experience of operating as a blue team member (and vice versa), and the skills gained there will be useful for them in trying to defeat their opponents.

If you know the enemy and know yourself you need not fear the results of a hundred battles.
– Sun Tzu, The Art of War

If you’re looking at the non-technical route, then a background in psychology or sociology is very useful. Experience of acting, or simply of talking to lots of different people, also helps, as does an understanding of verbal and non-verbal communication.

Blue Team (Defensive Security)

The defensive teams are also likely to have some very technical people in them. They may not write exploits like some pen testers, but some do need to have a very deep and detailed understanding of how things work.

Digital forensics is a highly specialised field, and there are individual specialities within it. For example, someone may only deal with mobile devices, so will need to understand Android, iOS (for Apple devices) and Windows Mobile, amongst others. Some may look mainly at memory stores, or disk drives etc. They also need to know how to capture, store and examine data in a methodical way which can be replicated in court, using the ACPO Good Practice Guide for Digital Forensics (in the UK – other countries may have other standards).

SOC (Security Operations Centre) Analysts look at information coming from a range of sources such as log files, and are skilled at looking at the big picture to identify attacks or other threats.  They need to understand networks, protocols and firewalls, how systems are configured and how the whole network interoperates.  They also need to understand patching and malware, to evaluate likely effects and the best methods of combating those threats.
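To give a flavour of the sort of analysis a SOC analyst might automate, here is a minimal, hypothetical Python sketch that counts failed SSH logins per source address in a Linux auth log and flags anything that looks like a brute-force attempt. The log path, the OpenSSH-style log format it expects and the threshold are all assumptions for illustration; in practice this kind of correlation is usually done in a SIEM.

```python
import re
from collections import Counter

# Hypothetical path and threshold -- adjust for your own environment.
LOG_FILE = "/var/log/auth.log"
THRESHOLD = 10   # flag any source IP with more failed logins than this

# Matches OpenSSH-style lines such as:
#   "Failed password for invalid user admin from 203.0.113.5 port 2222 ssh2"
FAILED_LOGIN = re.compile(r"Failed password for .+ from (\d+\.\d+\.\d+\.\d+)")

failures = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        match = FAILED_LOGIN.search(line)
        if match:
            failures[match.group(1)] += 1

for ip, count in failures.most_common():
    if count > THRESHOLD:
        print(f"Possible brute force from {ip}: {count} failed logins")
```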

Training courses vary, though SANS are renowned for their very detailed courses, particularly in the forensics arena. Again, CompTIA Net+ and Sec+ are good courses to start with before building up experience and looking at the more technical material available. Many courses relate to the toolsets that the team member uses, e.g. a particular Security Information and Event Management (SIEM) application or firewall product. Blue team members may also take some of the same courses that the red team members do – remember Sun Tzu!

Summary

There is a lot of scope for people who are not technical – and have no desire to be technical – to work in Information Security.  In many cases, the key skills / attributes include patience, attention to detail, concentration, focus, diligence and curiosity, as well as people skills like empathy and communication.

As someone who has worked in the industry for over 30 years, since before it was even called cyber security, I’d recommend it to anyone. There are so many opportunities, and so many different roles, that there is bound to be something for everyone!

I should also mention that the company I work for, PGI, runs many of the courses mentioned above, or equivalents of them: I’m one of the instructors on the awareness courses…

The Great Hack

It would appear that the furore over Facebook / Cambridge Analytica and manipulation of elections hasn’t died down that much. I recently watched a documentary on Netflix called The Great Hack, and I’d recommend that you do too, if you can.

The programme provided a lot of the backstory about who was involved, how and when, as told by some of the people who were there. These included:

  • Brittany Kaiser, the Director of Business Development at Cambridge Analytica, who had previously worked on Barack Obama’s presidential campaigns. She comes across as very naive at times, though towards the end of the programme the penny clearly drops and she grasps the seriousness of the situation;
  • David Carroll, a professor who, not unreasonably, asked for a copy of all the data that Cambridge Analytica held on him. If not for him, the whole situation might not have escalated as it did;
  • Julian Wheatland, the former CFO at Cambridge Analytica. I was concerned by how little he seemed troubled by the morality of what his company had done; and
  • Carole Cadwalladr, an investigative journalist at The Guardian and Observer newspapers in the UK. She did a lot of the digging and legwork, trying to find people who would and could talk to her about what had gone on. Carole was the reporter who broke the story, and who continued to find and release fresh information as time went on.

Perhaps the most shocking aspect of the programme was the revelation that Cambridge Analytica had been involved in some way in elections around the world since the mid-2000s. There was an exposé of how their work influenced elections in Trinidad and Tobago, which showed just how manipulative Facebook posts could be, as well as discussion of how the same techniques were used both for the Brexit campaign and for Trump’s election in 2016.

It was notable that Alexander Nix, the former head of Cambridge Analytica, declined to be interviewed, and also that Julian Assange and Wikileaks turned out to be part of the story. I didn’t know until I watched this that Steve Bannon, erstwhile Chief Strategist at the White House under Donald Trump and former executive chairman of Breitbart News, was a co-founder of Cambridge Analytica, or that Nigel Farage was closely linked with him.

It’s worth checking out Carole Cadwalladr’s TED talk addressed to Silicon Valley, in which she asks the heads of the big tech companies whether they are happy with the world they are creating. She suggests that it is now impossible to have a free and fair election because of the abuse of their technologies.

She illustrated this ably by talking to people in South Wales about why they voted for Brexit: many said they were worried about immigration (she also spoke to someone who believed they were among the only immigrants in the area), while others said the EU had done nothing for them, even though they were surrounded by construction and facilities paid for by well-advertised EU funding.

I’ve mentioned the perils of taking part in online quizzes and personality profiles “for fun” on Facebook. This documentary provides the evidence of how that information can be harvested and used to target specific people – never mind groups – who are deemed to be persuadable and who can swing an election result one way or another.


A new approach for 2019

I know it’s a bit hackneyed, but making New Year’s resolutions is part and parcel of this time of year. Wouldn’t it be great if everyone in security made the same one, committing to doing the same thing? We’d need to bring others with us: our IT colleagues, our enthusiastic amateur friends, and particularly the media and marketing people around the globe.

Let’s try to see, report on and celebrate the positives, not just focus on the negatives.

The press and online media seem to be full of stories about data breaches, ransomware, data losses and other information security related catastrophes. When these occur, my LinkedIn, Twitter and Instagram feeds fill up with people talking about the breaches: how terrible they are, how companies can allow things like this to happen, and so on. I’m sure you’ve noticed it too. It’s almost as if people are glorying in, even celebrating, the misfortunes of others.

Yes, we security professionals have a responsibility to identify weaknesses in systems and people, and try to mitigate those weaknesses. However, I think we have a greater responsibility to provide encouragement and support to our colleagues, acquaintances, friends and family. They’ve become much more aware now of the impact of their online actions, as illustrated in this story from the BBC. But many people have little or no idea how to protect themselves effectively.

If it feels like we keep having to repeat the same messages over and over, there’s a very good reason for that, which Rik Ferguson highlighted in a podcast with Jenny Radcliffe last year (2017). He said “Every day is someone’s first day online”. This is true, and I think we often forget that fact. This is why we have to keep repeating the basics, because these are new to people, and will continue to be so for years to come.

How do we change the narrative, from highlighting the negatives, to emphasizing the positives? Rather than say “there was a breach because such-and-such happened”, can we say “the breach could have been worse, but controls x, y and z helped make sure it wasn’t”? Rather than castigating individuals for missing a patch, can we not praise them for applying as many as they do? Those in the know already appreciate how hard it is to do even the simple things consistently well over the course of a year, and some things are bound to slip through the net.

I think it’s time for change. I think it’s time we recognised the excellent work so many people do. I think it’s time to shine the light on the positives.

Let’s try to see, report on and celebrate the positives, not just focus on the negatives.

Alexa – can you eavesdrop on us please?

After my post last week about the Panorama programme here in the UK, there was a story in the news today about a couple in the US who were surprised by a call from a friend who had been emailed a recording of their conversation. Read all about it here. And no, I couldn’t believe Amazon’s excuse either!

What’s the deal with passwords?

In an earlier post I talked about password hygiene, and about the challenges we have in keeping passwords secret.  I realised that I’d missed the opportunity to talk about why we need passwords – so I thought I’d cover it now.

Computers will – if set up “normally” – ask for a username and password after you switch them on. This process is called authentication (though more commonly we call it logging in or logging on), and in the early days (before the Internet existed) it was seen as quite a good way of ensuring that the person entering the username is who they say they are. One reason why this is important is accountability: if something bad has happened, it can often be tracked back to a specific username. The person who “owns” that username can be held accountable – and those who don’t “own” it can be discounted as the culprit. It’s therefore quite a good protection mechanism for the other users.

Once that single computer was connected to lots of others, and particularly once it was connected over the Internet, some people took up the challenge of trying to access those remote systems by guessing usernames and passwords (at a very basic level, this is what hackers try to do). Passwords which are easy to guess mean that the bad guys don’t have to work very hard to access your account. Once they have access to your computer, they will often try to see what else they can get into, such as your bank account, financial details, holiday plans etc.

Have a look at the image below:

[Image: lists of the most common passwords over the past five years]

It’s obvious that the most common passwords (and therefore the easiest to guess) haven’t changed much over the past five years. This is bad!

The bad guys use a range of software tools to try to break (or crack) passwords, and generally speaking, the longer the password, the better. But length alone isn’t the answer. If the password is just numbers, the bad guys “only” need to try combinations of 0 to 9 of increasing length, i.e. 0, 1, 2… then 00, 01, 02 and so on. If it’s just lower or upper case letters, i.e. a to z or A to Z, then there are only 26 possible characters for each position to try before moving on to a longer length.

Mixing numbers, upper and lower case letters and special characters (e.g. !@£$%^) gives a much larger set of possibilities which need to be tried, and this mix is what is called a complex password. In all cases, the longer the combination the better, but the industry standard is a minimum of eight characters. Personally, I prefer at least 15 characters, because the maths shows that with current computing power, complex passwords of that length are very, very difficult to crack.
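To put some rough numbers on that, here is a quick Python sketch comparing the size of the search space for different character sets and lengths. The guesses-per-second figure is purely an assumption for illustration; real cracking speeds depend enormously on the attacker’s hardware and on how the passwords are stored and hashed.

```python
# Back-of-the-envelope comparison of password search spaces.
GUESSES_PER_SECOND = 1e10  # assumed attacker speed: 10 billion guesses/second

charsets = {
    "digits only (10 characters)": 10,
    "lower case letters only (26)": 26,
    "upper + lower + digits + symbols (~94)": 94,
}

for label, size in charsets.items():
    for length in (8, 15):
        keyspace = size ** length                  # total combinations
        seconds = keyspace / GUESSES_PER_SECOND    # worst case to try them all
        years = seconds / (60 * 60 * 24 * 365)
        print(f"{label}, length {length}: {keyspace:.2e} combinations, "
              f"roughly {years:.2e} years to exhaust at the assumed rate")
```

Even with these crude assumptions, going from an 8-character to a 15-character complex password takes the worst-case effort from a matter of days to something astronomically longer.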

Obviously, the longer and more complex the password, the more likely you are to forget it, which is why good password hygiene is required.  Password hygiene can be compared to personal hygiene, and more particularly your underwear.

[Image: the “passwords are like underwear” analogy]

So – keep your passwords to yourself, change them regularly, and don’t show them to anyone else!