Dipping My Feet into Cyber Security and IT Audit

One of the advantages of the Information Technology and Management program at The University of Texas at Dallas is that its curriculum is designed for learning the fundamentals of many pillars of IT. Of course, it is important to pick one focus and develop your skills in it. But with every passing day, the tech industry is evolving toward cross-functional roles and integrations as the norm. You need to be able to wear many hats. So, it makes sense to explore topics which might not really be your focus but still fall under the technology umbrella. This is why I was happy with my decision to take IT Security and IT Audit and Risk Management, two subjects that enhanced my basic knowledge of some of the most promising and vital aspects of Information Technology.

Photo by Markus Spiske on Unsplash

My cyber security class, led by the insightful Prof. Nate Howe, was eye-opening, as it got me well-versed in many aspects of IT risk and security. I understood the entire structure of IT security functions in an organization, with a sneak peek into the responsibilities of the Chief Information Security Officer. The class took me down interesting avenues like the CIA triad, Business Continuity Planning and Disaster Recovery, ransomware and the anatomy of an attack, TCP/IP basics, the secure SDLC, and IT control frameworks. The class ended on a high note with guest speakers and industry professionals coming in to give demos of Kali Linux and penetration testing. It sure got me excited about exploring Kali Linux more in the future.

The IT Audit and Risk Management class was a lot of fun, despite being so full of theory. This was because of the cool Prof. Joseph Mauriello, who always kept us engaged with his sense of humor and class-end quizzes. This class was the reason I became a member of the student chapter of ISACA – a club that had the best meet-ups and the most delicious food. In Prof. Mauriello’s class, I learnt the fundamentals of auditing IT governance controls, operating system and network controls, types of DoS attacks, and the risks associated with different IT functions and ERP systems. It is thanks to this class that I know about the ACL software, the Sarbanes-Oxley Act, and the fraud triangle.

The IT Audit group

While these subjects were not in line with my analytics focus, taking them was absolutely worth it, as they served as essential training in fundamental IT concepts. I also made some really cool friends (Hi Micah, Marie, Jeyaraj, Chloe, Diksha!) whom I enjoyed spending classwork and project time with. And that brings us to a professor whom I have thanked several times, but without whom my series cannot be complete. I call him Master Yoda, for without his guidance I wouldn’t have been able to walk even two steps on this long path. I’m talking about the Program Director of MS Business Analytics at UTD, Dr. Bill Hefley.

Team Travelytics with Dr. Hefley

You might think that it is weird for me to be talking about Dr. Hefley in a post describing my experience in subjects that do not fall under his Business Analytics domain. But that is precisely why Dr. Hefley is an extremely special teacher to me. I had joined UTD as an MS BA student and moved to ITM after my first semester as it aligned better with my experience and goals. Despite this, Dr. Hefley has continued to be my guardian angel. As faculty advisor for Travelytics, he has made sure everyone in our team has someone to go to. To me personally, he has been someone I can always write to (and I hope it continues to be that way) or walk up to. Every interaction with him has been warm and comforting. With his fun one-liners and cheerful yet informative emails, he is someone who is always there to cheer me up and keep me going. So, at the risk of boring you with my gratitude one more time, Dr. Hefley – THANK YOU.

ALSO SEE Saying “Hello, old friend” to Statistics and Analytics

This is the ninth post of my #10DaysToGraduate series where I share 10 key lessons from my Master’s degree in the form of a countdown to May 8, my graduation date.

Staying in the Loop with Python, the Queen of Data Science

My on-and-off relationship with Python began a few months before I started my Master’s degree. When I knew that I was going to turn towards IT, it was a no brainer that I had to raise my coding game. I had learned C programming during my engineering days but that was almost a decade ago. So, to go back to my roots, I took a weekend course in object-oriented programming with Java. While it was a lot of fun, it became clear to me that Java, though brilliant, was more of a mobile app development tool (no offense, Java lovers!). There was another language that reigned over the data science kingdom and for any chance of success as a data analyst, I had to woo her.

I started learning Python with the MIT OCW course (edX) on Introduction to computational programming with Python to understand the basic data structures and some beginner-level programs. While I got through the basics, I could not complete this course as, after a point, I found it to be a bit dry. And that was that. At UTD, I was already making good progress in my analytics learning trajectory thanks to my work with R programming. So there was no need to hurry things up with another language. However, as things progressed with my club Travelytics, and I came across competitions online, I couldn’t delay getting my hands dirty with Python anymore.

So, I dived right in with Kaggle Learn's wonderful data science track, which started with 7 hours of Python, including all the basics from variables, lists, loops and functions to important libraries and elementary programs. This was followed by my internship at iCode where I worked with Python projects and also trained over 50 students in the foundations of Python and machine learning. The hands-on exercises and projects at iCode, like building a movie recommender system, were of great help in laying down the foundations of Python for data science in my brain.
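For a flavor of those first few hours, here is a tiny illustrative sketch of my own (not taken from the Kaggle track itself) that touches the same basics: variables, lists, loops and functions.

```python
# Variables, a list, a loop and a function: the very first Python building blocks.

def average(numbers):
    """Return the arithmetic mean of a list of numbers."""
    total = 0
    for n in numbers:            # loop over the list
        total += n
    return total / len(numbers)

study_hours = [2, 3.5, 1, 4]     # a list stored in a variable
print(f"Average study time: {average(study_hours):.2f} hours")
```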

Joseph Kim in one of his Python sessions at UTD

Back at UTD, it helped that my friend, Joseph Kim, who was the President of the data science club, conducted some amazing hands-on sessions for people to learn Python basics. Attending these sessions helped me, and many others, stay in the loop (pun totally intended). Then came my own Python research for my facial recognition project to address crime tourism, at the end of which I had adapted three simple Python programs that detect and recognize faces in real time. This was my most memorable time spent with Python programming, as I was able to see some tangible results generated by code written by me.

In the last few months, I have been following the extraordinary free YouTube lessons of Krish Naik. His Machine Learning playlist is the most valuable resource I have found online, helping me practise everything from the use of impressive data science libraries like NumPy, Pandas and scikit-learn to data visualization exercises with matplotlib and Seaborn. He is also an excellent coach in analytics concepts like entropy and Gini impurity, and machine learning algorithms like regression, k-means clustering, k-nearest neighbors, decision trees and ensemble methods.
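As a quick illustration of two of those concepts, here is a minimal NumPy sketch of my own (not from the playlist) that computes the entropy and Gini impurity of a set of class labels:

```python
import numpy as np

def entropy(labels):
    """Shannon entropy (in bits) of a sequence of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini_impurity(labels):
    """Gini impurity of a sequence of class labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1 - np.sum(p ** 2)

labels = ["spam", "spam", "ham", "ham", "ham", "ham"]
print(f"Entropy: {entropy(labels):.3f} bits")         # ~0.918 for a 2-vs-4 split
print(f"Gini impurity: {gini_impurity(labels):.3f}")  # ~0.444
```

A pure node (all labels the same) would score 0 on both measures, which is exactly what a decision tree tries to move toward with every split.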

We are truly fortunate to live in a world and time where so many resources are available for anyone who has an Internet connection and wants to learn. I am currently working my way through Kirill Eremenko’s acclaimed Udemy course on Python for data science. While all these wonderful online resources have their charm, nothing comes close to in-class training. This became evident in my object-oriented programming class with Dr. Nassim Sohaee. Her diligent classwork and challenging assignments, which I am still working on, have been excellent tools to help me understand the nuts and bolts of object-oriented design and the anatomy of Python programming. I have worked on various projects dealing with loops, functions, classes, inheritance and exception handling. In addition to all the data science exercises, this class has helped me gain more confidence in leveraging Python as a powerful programming language in the time to come.
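To give a sense of those building blocks, here is a small self-contained sketch of my own (not an actual class assignment) that combines a class, inheritance and exception handling:

```python
class Dataset:
    """A tiny base class holding a name and a list of records."""
    def __init__(self, name, records):
        self.name = name
        self.records = records

    def size(self):
        return len(self.records)

class LabeledDataset(Dataset):              # inheritance: extends Dataset
    def __init__(self, name, records, labels):
        super().__init__(name, records)
        if len(records) != len(labels):     # defensive validation
            raise ValueError("records and labels must be the same length")
        self.labels = labels

try:
    ds = LabeledDataset("emails", records=[[1.0], [2.0]], labels=["ham"])
except ValueError as err:                   # exception handling
    print(f"Could not build dataset: {err}")
```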

ALSO SEE Saying “Hello, old friend” to Statistics and Analytics
Diving Deep into Business Analytics with R Programming

This is the fifth post of my #10DaysToGraduate series where I share 10 key lessons from my Master’s degree in the form of a countdown to May 8, my graduation date.

Balancing Up with SQL and Database Management

I had understood very early on while learning the basics of data science that the three pillars of a sturdy analytics structure are statistics, a programming language, and database management. So, after covering the first two in my previous posts, it’s natural that I move to database foundations.

During Fall 2018, I started learning the basics of databases in Dr. James Scott’s class. The man is a gifted speaker and entertainer. His class was full of marvelous impressions, anecdotes from his variety of experiences, and exciting PowerPoint presentations. It was here that I understood the concept of data modeling with topics like primary and foreign keys, Entity Relationship Diagrams (ERDs), schemas and sub-schemas, weak and strong relationships, and normalization. However, the most important part of this class was that it got me started in one of THE MOST IN-DEMAND TOOLS asked for in every job role I desire – SQL!
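Whatever database system the class itself ran on, the idea of primary and foreign keys is easy to show in a few lines. Here is a toy schema of my own using Python's built-in sqlite3 module (purely illustrative, not from the coursework):

```python
import sqlite3

conn = sqlite3.connect(":memory:")        # throwaway in-memory database
conn.execute("PRAGMA foreign_keys = ON")  # SQLite only enforces FKs with this pragma

conn.executescript("""
CREATE TABLE department (
    dept_id   INTEGER PRIMARY KEY,                     -- primary key
    dept_name TEXT NOT NULL
);
CREATE TABLE employee (
    emp_id    INTEGER PRIMARY KEY,
    emp_name  TEXT NOT NULL,
    dept_id   INTEGER REFERENCES department(dept_id)   -- foreign key
);
""")
```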

Photo by Tobias Fischer on Unsplash

As my friend Ankita loves saying – SELECT is written in our star(*)s. It was a delight to work on class assignments that tested our knowledge of dependencies, NULL values, SQL functions, relational operators, joins, sub-queries, and views. We also got into the basics of transaction management using SQL. And since we had worked extensively with relational databases for most of the class, Dr. Scott spent the last leg of our semester teaching us the basics of NoSQL and MongoDB. It formed a great runway for my future big data endeavors.
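Continuing that toy sqlite3 sketch, here is roughly what the bread and butter of those assignments looked like: inserting rows inside a transaction, then joining and aggregating them (again, my own illustrative example, not an actual assignment):

```python
# Insert sample rows; the 'with' block wraps them in a single transaction
# that commits on success and rolls back on error.
with conn:
    conn.executemany("INSERT INTO department VALUES (?, ?)",
                     [(1, "Analytics"), (2, "Audit")])
    conn.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                     [(10, "Ankita", 1), (11, "Micah", 2)])

# An inner join plus an aggregate
rows = conn.execute("""
    SELECT d.dept_name, COUNT(e.emp_id) AS headcount
    FROM department d
    JOIN employee e ON e.dept_id = d.dept_id
    GROUP BY d.dept_name
""").fetchall()
print(rows)   # e.g. [('Analytics', 1), ('Audit', 1)]
```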

My SQL and database learning during this semester culminated in a project where I got my hands dirty with some data munging, database modeling and even regression using SQL and R. Just cleaning the data before we could perform any kind of retrieval was a task in itself. Thanks to this class, I find myself proficient in creating ERDs and working with various SQL joins and clauses to retrieve simple as well as aggregated data from complex data sets.

ALSO SEE Saying “Hello, old friend” to Statistics and Analytics
Diving Deep into Business Analytics with R Programming

This is the third post of my #10DaysToGraduate series where I share 10 key lessons from my Master’s degree in the form of a countdown to May 8, my graduation date.

Diving Deep into Business Analytics with R Programming

When a class is named after your graduation major, and one of the most popular disciplines in the present world, you know it’s going to be pivotal in your learning path. BA with R proved to be just that. The brilliant Dr. Sourav Chatterjee made it clear right at the beginning that R programming was going to be used just as a tool (which it is) to understand and master the nuances of business analytics. Having said that, his course material left no stone unturned in taking us through all aspects of R programming needed for data science.

I had worked a bit with Java and PHP, but this was my first experience with the R programming language. I started with an introductory course on DataCamp to quickly learn the very basics of R, like vectors, matrices and data frames. Then, in class, Dr. Chatterjee proved to be a dedicated and patient professor as he started with basic manipulations and sample generation in R and then quickly moved to the foundations of data analytics. We got familiar with libraries like tidyverse, forecast and gplots, and toyed with data visualization using ggplot on some interesting data sets. We created several plots, graphs, charts, and heatmaps before scaling up to larger data sets.

This was followed by some of the most important things a business analyst or data scientist learns in their career. So far, everything had looked pretty straightforward to me, but now was the time to push boundaries and actually dive deep into analytics. I was introduced to dimension reduction, correlation matrices and the all-important analytics task of principal component analysis (PCA). I learnt how to evaluate the performance of models, create lift and decile charts, and assess classification with the help of a confusion matrix – all with just a few lines of code. As Dr. Chatterjee explained time and again, it was never about the code. It was about knowing when and how to use it and what to do with the result.
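The class did all of this in R, but for readers who live in Python, here is a rough analogue of two of those ideas, PCA followed by a confusion matrix, sketched with scikit-learn on a standard sample data set:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)

# Dimension reduction: keep the two principal components that explain most variance
X_2d = PCA(n_components=2).fit_transform(X)

X_train, X_test, y_train, y_test = train_test_split(X_2d, y, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Rows are actual classes, columns are predicted classes
print(confusion_matrix(y_test, model.predict(X_test)))
```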

Dr. Sourav Chatterjee’s BA with R class

We then followed the natural analytics progression with linear and multiple regression, where I learned about partitioning data and generating predictions. This was followed by a thorough understanding of the KNN model and how and when to run it. By now, I was beginning to get the hang of problem statements and the approach to take to solve them, thanks to class assignments on real-world scenarios like employee performance and spam detection. Through the examples done in class, it was easy to grasp the concepts of the R-squared value, the p-value and the roles they play in model evaluation. It was in this class that I understood logistic regression, discriminant analysis, and association rules for the first time, and I have been working with them ever since, in every data science course or project that I have taken up.
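Again in Python rather than the R we used in class, partitioning the data and running KNN boils down to something like this minimal scikit-learn sketch (scaling is included because KNN is distance-based):

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

# Partition the data into training and hold-out sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
knn.fit(X_train, y_train)
print(f"Hold-out accuracy: {knn.score(X_test, y_test):.3f}")
```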

All of this knowledge and Dr. Chatterjee’s guidelines were put to use in the final project, where I worked with a group led by the talented Abhishek Pandey on London cabs data. After rigorous work on large data sets downloaded and extracted from various sources, we trained models to predict cab arrival times and compared their RMSE across random forests, logistic regression, and SVMs. It was a great way to put into practice everything we had learned over four months.
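As a rough idea of how such an RMSE comparison can be set up (with synthetic data and just two stand-in models here, nothing like the real trip data we wrangled), the skeleton looks like this:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the real data set
X, y = make_regression(n_samples=1000, n_features=10, noise=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(random_state=0))]:
    model.fit(X_train, y_train)
    rmse = np.sqrt(mean_squared_error(y_test, model.predict(X_test)))
    print(f"{name}: RMSE = {rmse:.2f}")
```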

And with that, I had laid a robust foundation in data analytics and was ready to build on it further in the time to come. By January 2019, I was confident enough to dive into analytics projects and work on complex data sets to generate prediction models using the tools taught by Dr. Sourav Chatterjee.

ALSO SEE Saying “Hello, old friend” to Statistics and Analytics

This is the second post of my #10DaysToGraduate series where I share 10 key lessons from my Master’s degree in the form of a countdown to May 8, my graduation date.

Facial Recognition with Python, OpenCV and Raspberry Pi

Everybody Loves Recognition! Technically, recognition is the identification of someone or something from previous encounters or knowledge. But how can it be used to solve real-world problems? This was the premise of a facial recognition project I built using Python and OpenCV on a Raspberry Pi. All the code for this project is available on my GitHub page.

The Problem

Crime tourism, which is very different from ‘crime against tourists’, refers to organized gangs that enter countries on tourist visas with the sole intention of committing crime or making a quick buck. Residing in their destination countries for just a few weeks, they seek to inflict maximum damage on locals before returning to their home countries. It’s something that has been picking up all over the world, especially in Canada, the US, and Australia. Here’s an excerpt from a Canadian report:

“Over the weekend, we got a notification that there were at least three people arrested,” he said. “And there were two detained yesterday in a different city. It’s just a growing problem.” When police in Australia broke up a Chilean gang in December, they thanked Canadian police for tipping them off. Three suspects who’d fled Ontario and returned to Chile turned up in Sydney, Australia. The tip from Halton Regional Police led to eight arrests and the recovery of more than $1 million worth of stolen goods.

While the tip came in handy, it would be much more effective to have portable facial-recognition devices at airports and tourist spots to identify criminals and stop them before they can commit crimes in a new destination.

The Solution

I used crime tourism as an example problem to demonstrate the use of facial recognition as a solution. It started with buying a Raspberry Pi 3 ($35) and a 5 MP 1080p mini Pi camera module ($9) and configuring them.

Then, using Adrian Rosebrock’s brilliant tutorial, I embarked on a 10-hour journey (full of mistakes made on my part) to compile OpenCV on my Raspberry Pi! Here are some important things to remember from this compilation expedition:

• You need to expand the file system to be able to use the entire 32 GB of the Pi’s memory card.
• You need to create a Python 3 virtual environment and always make sure that you’re working inside that environment.
• Before you begin the compile process, increase the swap space from 100 MB to 2048 MB so that you can compile OpenCV with all four cores of the Raspberry Pi (and without the compile hanging due to memory exhaustion).
• After installing NumPy and completing your OpenCV compilation, reduce the swap space back to 100 MB.

Python Code for Facial Recognition

I then followed MJRoBot’s tutorial to write three simple Python programs for the actual facial recognition using OpenCV. Face detection is performed using Haar feature-based cascade classifiers, an effective object detection method proposed by Paul Viola and Michael Jones in their 2001 paper, “Rapid Object Detection using a Boosted Cascade of Simple Features”. It is a machine learning based approach where a cascade function is trained on a large number of positive and negative images and is then used to detect objects in other images. The Haar cascades directory is readily available on the OpenCV GitHub page.
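To show how little code the detection step needs, here is a minimal sketch of reading frames from a camera and drawing boxes around detected faces. The cascade path below assumes a pip-installed OpenCV that ships the cv2.data helper; on a Pi where OpenCV was compiled from source, you may need to point to the XML file in your own haarcascades folder.

```python
import cv2

# Pre-trained frontal-face Haar cascade shipped with OpenCV (path may vary by install)
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
face_cascade = cv2.CascadeClassifier(cascade_path)

cap = cv2.VideoCapture(0)                  # Pi camera / webcam stream
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.2, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("faces", frame)
    if cv2.waitKey(1) & 0xFF == 27:        # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```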

Demonstration

I presented this project on my last day as President of the UTD club, Travelytics. There, I conducted a live demonstration: the Pi camera capturing my face when I ran the first Python program, the model being trained by the second program, and real-time facial recognition running with the third program. Here’s a glimpse:

This project proved to be an excellent route for me to learn the basics of Python, OpenCV, computer vision, and the Raspberry Pi, and to see how a low-budget, effective facial recognition solution can be applied to complex problems.
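For anyone curious about the middle step of that pipeline, here is a simplified sketch of what the training program boils down to. It assumes face crops have already been captured into a dataset/ folder with the numeric user ID embedded in each file name (an illustrative convention; adjust it to match whatever your capture step produces), and it needs the opencv-contrib build for the cv2.face module.

```python
import os
import cv2
import numpy as np

# Train an LBPH face recognizer on grayscale face crops saved as 'User.<id>.<n>.jpg'
recognizer = cv2.face.LBPHFaceRecognizer_create()   # requires opencv-contrib

faces, ids = [], []
for filename in os.listdir("dataset"):
    img = cv2.imread(os.path.join("dataset", filename), cv2.IMREAD_GRAYSCALE)
    if img is None:
        continue                                    # skip anything that isn't an image
    ids.append(int(filename.split(".")[1]))         # numeric user id from the file name
    faces.append(img)

recognizer.train(faces, np.array(ids))
recognizer.write("trainer.yml")                     # the real-time program loads this model
print(f"Trained on {len(faces)} images for {len(set(ids))} user(s)")
```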