Projects

Below is a selection of my projects. Some are from academia, and others I have worked on outside of school. Many hours went into these projects, and I have done what I can to provide the latest version of each. To prevent plagiarism, please use your Google account to request access. If you do not have a Google account, please send me an email from your company-issued address.

Unsubscribe from those pesky emails! Chrome Extension: unsubME

Context

My inbox was saturated with marketing promotions, and I realized I had been spending a lot of time archiving emails. I wanted to boost my productivity by leveraging Google's API to automate the process -- so I did just that.

Goals

The long-term goal remained simple; however, I quickly realized how requirements change throughout a project. As I completed feature after feature, I found that more were needed for my idea to materialize into a full-fledged solution. In its current state, the extension does exactly what it sets out to do. Future releases will address any bugs reported by users and improve the success rate of unsubscribing without user intervention.

Execution

  •   Learned how to set up OAuth2 and request access to users' data via an authentication token.
  •   Configured the manifest file (manifest.json) to set up the basic structure of the extension.
  •   Used the Gmail API to retrieve a user's messages and parsed them for the relevant unsubscribe links (sketched below).
  •   Implemented message passing in conjunction with event handlers to transfer data between JS files in the application environment.
  •   Used JavaScript to dynamically create HTML at runtime and display an aggregated array of unsubscribe links to the user.
  •   Saved user data via Chrome's storage sync so that the extension maintains state regardless of a user's device.
  •   Navigated various asynchronous issues arising from the nature of web requests and client-side execution of code.
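The extension itself runs as JavaScript inside Chrome, but the Gmail API flow at its core is easy to illustrate. Below is a minimal Python sketch of the same idea -- list promotional messages and collect any links from their List-Unsubscribe headers. The search query, credential handling, and function name are assumptions made for this illustration, not the extension's actual code.

```python
# Illustrative sketch only -- the real extension does this in JavaScript inside Chrome.
# Assumes google-api-python-client is installed and `creds` holds valid OAuth2 credentials.
from googleapiclient.discovery import build

def collect_unsubscribe_links(creds, max_results=25):
    service = build('gmail', 'v1', credentials=creds)
    # Search for promotional mail that advertises an unsubscribe option.
    resp = service.users().messages().list(
        userId='me', q='unsubscribe', maxResults=max_results).execute()
    links = []
    for ref in resp.get('messages', []):
        msg = service.users().messages().get(
            userId='me', id=ref['id'],
            format='metadata', metadataHeaders=['List-Unsubscribe']).execute()
        for header in msg['payload'].get('headers', []):
            if header['name'].lower() == 'list-unsubscribe':
                # Header value looks like "<https://...>, <mailto:...>".
                links.extend(part.strip(' <>') for part in header['value'].split(','))
    return links
```

In the extension, the equivalent requests are made from JavaScript, and the aggregated links are rendered into the popup's HTML at runtime.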
View/Download Extension
Data, data, and more data Python:   ServiceNow ERP Migration

Context

For my CIS final project, our group was tasked with migrating a large volume of data into Cal Poly Pomona's new ERP platform, ServiceNow. The data consisted of all eHelp pages on the school's website, which had previously been written in MediaWiki markup rather than HTML. Our goal was to derive an efficient solution that could parse the unconventional markup and convert these pages to HTML within a span of ten weeks.

The Details

The clients who sponsored this project were four brilliant individuals heading our campus's IT department. My team consisted of five students in the same program, all with varying levels of experience and exposure to IT. After I took the initiative to write the first dozen lines of code, our Scrum Master assigned me the role of 'Lead Developer.' Aside from these two roles, the remaining three positions formed a sub-team of two QA testers and another developer to assist with coding. Having previous experience with Python, I pushed my team toward a Python-based solution due to its ability to parse data efficiently and effectively.

Execution

  •   Held weekly meetings with our client to ensure the project stayed aligned with the clients' vision.
  •   Used Slack as a channel to organize our discussions, post solutions, and communicate on a daily basis.
  •   Held sprint meetings once a week to set goals in motion and benchmark project status.
  •   Parsed through 100,000+ lines of an XML dump file from the previous MediaWiki platform and extracted all eHelp page titles (see the sketch below).
  •   Extracted the HTML source for each page by using a mock browser object to visit the page and record the HTTP response.
  •   Parsed the source of each page further to grab only the necessary subset of data and its associated metadata.
  •   Produced a single XML file as the script's output, containing all of the captured HTML and metadata in an organized structure.
  •   Bulk imported all converted pages into the ServiceNow platform via that single XML file.
  •   Provided our QA team with test criteria to ensure our development efforts were error-free and functioning as intended.
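The full script is available on request (see below), but the first two steps -- pulling page titles out of the MediaWiki XML dump and fetching each page's rendered HTML -- can be sketched roughly as follows. This is a simplified illustration that assumes a standard MediaWiki export layout and a placeholder wiki URL, not the deliverable itself.

```python
# Minimal sketch of the dump-parsing step -- not the team's actual migration script.
# Assumes a MediaWiki XML export with <page><title>...</title></page> elements
# and a placeholder base URL for the eHelp wiki.
import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

def extract_page_titles(dump_path):
    """Pull every page title out of a MediaWiki XML dump."""
    titles = []
    for _, elem in ET.iterparse(dump_path):
        # MediaWiki exports namespace their tags, so match on the local tag name.
        if elem.tag == 'title' or elem.tag.endswith('}title'):
            titles.append(elem.text)
            elem.clear()  # keep memory flat on 100,000+ line dumps
    return titles

def fetch_rendered_html(base_url, title):
    """Visit the live page like a 'mock browser' and return the HTTP response body."""
    url = base_url + urllib.parse.quote(title.replace(' ', '_'))
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode('utf-8', errors='replace')
```

The captured HTML and metadata would then be written back out into the single XML file that ServiceNow imports in bulk.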
Email me for the deliverables
A console based adventure! Java:  Dungeon Game

Context

This was our last assignment in CIS 234 - Object Oriented Design and Java Programming. My goal was to use a combination of hash maps and arrays both to simulate a 2D world and to provide a way to hold artifacts.

Goals

  •   Navigate through a map full of rooms using console commands.
  •   Each room must have a name, description, artifact, and a position relative to the other rooms.
  •   Create a 'backpack' to store artifacts you've picked up. Implement a carry limit.
  •   Drop artifacts in any room. If an artifact is present in that room, swap it.
  •   Implement an 'ability' to inspect rooms prior to picking up items.
  •   Extra credit: Change the state of any of the rooms following the removal of an artifact. Both the room description and inspect ability should yield a new message.

Execution

Console input was implemented via the 'Scanner' class, prompting the user to enter recognized commands. That input determined the program's next course of action through a series of if/else and switch statements. Perhaps the most difficult part of the project was finding an efficient way to save and load game data so a user could continue where they left off; this meant saving the current room as well as any artifacts picked up while navigating the map. In the end, the decision was made to store the relevant data in an external file as JSON.
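The assignment itself is written in Java, but the save/load design is simple enough to sketch. Here is a rough Python illustration of the same idea: a command dispatch loop plus a JSON save of the current room and backpack. The state layout, command names, and file name are assumptions made for the sketch, not the original code.

```python
# Rough illustration of the same design -- the assignment itself is in Java.
# The state layout, commands, and file name below are assumed for this sketch.
import json

state = {'current_room': 'Entrance', 'backpack': []}

def save_game(path='savegame.json'):
    """Persist the current room and collected artifacts, mirroring the JSON save file."""
    with open(path, 'w') as f:
        json.dump(state, f)

def load_game(path='savegame.json'):
    with open(path) as f:
        state.update(json.load(f))

def handle(command):
    # Equivalent of the Java if/else and switch dispatch on Scanner input.
    if command == 'save':
        save_game()
    elif command == 'load':
        load_game()
    elif command.startswith('go '):
        state['current_room'] = command.split(' ', 1)[1]
    elif command.startswith('take '):
        state['backpack'].append(command.split(' ', 1)[1])
    else:
        print('Unrecognized command')
```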

View on Google Drive

View Documentation
A Java web applet that has e-store functionality Java:  Store Server Applet

Context

In my next Java class, we were given the challenge of building a functional Java-based GUI application that could run in a browser. The biggest challenge -- the security restrictions associated with running Java in the browser.

Goals

  •   Create an interactive 'online' shopping cart.
  •   List store inventory in a panel and dynamically update availability.
  •   Create GUI components to add, remove, and look up specific details of items.
  •   Implement controls for when an item is out of stock.
  •   Implement store inventory and product-specific attributes via an XML file.
  •   Have attributes unique to each 'class' of product. The three classes are Music, Videos, and Books.

Execution

The goals above were achieved by instantiating GUI components already available in the Java libraries. Event listeners were set up to monitor user interaction with the GUI and to call other functions defined in the source code. Parsing the XML was not difficult, but it required some logic to grab the data between the opening and closing tags. To implement the product classes, concepts such as inheritance, encapsulation, and polymorphism were utilized.
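The applet does all of this in Java, but the two ideas that carried the project -- reading the inventory out of XML and giving each product class its own attributes -- translate into a compact sketch. The Python sketch below assumes a made-up <inventory> layout with <item type="..."> elements; the real assignment's XML format and class names may differ.

```python
# Illustrative sketch of the product hierarchy and XML inventory parsing.
# The real applet does this in Java; tag and attribute names here are assumed.
import xml.etree.ElementTree as ET

class Product:
    def __init__(self, title, price):
        self.title, self.price = title, float(price)

class Music(Product):
    def __init__(self, title, price, artist):
        super().__init__(title, price)
        self.artist = artist  # attribute unique to the Music class

class Video(Product):
    def __init__(self, title, price, runtime):
        super().__init__(title, price)
        self.runtime = runtime

class Book(Product):
    def __init__(self, title, price, author):
        super().__init__(title, price)
        self.author = author

def load_inventory(path):
    """Build product objects from an <inventory> file of <item type='music' ...> elements."""
    kinds = {'music': Music, 'video': Video, 'book': Book}
    items = []
    for node in ET.parse(path).getroot().findall('item'):
        cls = kinds[node.get('type')]
        extra = node.get('artist') or node.get('runtime') or node.get('author')
        items.append(cls(node.get('title'), node.get('price'), extra))
    return items
```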

This project was fun, but the constraints on Java web applications can be tricky, as most modern browsers take precautions to ensure Java applets do not pose a security threat. In short, if you want to view this project, just click the button below -- but be warned, you will need to open the Java console and add a security exception. You will also need the JRE plugin installed to run Java in your browser. Fingers crossed.

View on Google Drive

View Documentation
Create, Insert, Query SQL: Small Enterprise Database

Context

My first database class was challenging, but it was a class in which I learned a great deal about the SQL language and the structure of databases. I dove in with Microsoft SQL Server 2012 and quickly learned how to query information from a database. Prior to that, I constructed the database with ERwin modeling software.

Goals

This quarter-long project had three iterations, each expanding on the requirements of the database. The first objective was to take the provided data and organize it into third normal form, resulting in an effective data structure that would guarantee integrity. Extensive planning went into ensuring that relationships between entities were correct and that foreign keys sat in the appropriate tables. Additionally, certain constraints needed to be put in place, such as data validation on entry and cascading rules for when entries are removed.

Execution

  •   Matched the provided attributes with the entities that would yield the best data structure.
  •   Used third normal form to reduce redundancy and improve the data structure.
  •   Used the ERwin modeler to build the table structure and ensure proper data types were used.
  •   Used the ERwin modeler to draw relationships between tables (one-to-one, many-to-many, one-to-many).
  •   Set up Microsoft SQL Server 2012 and had ERwin push the generated schema to my server (see the sketch below).
  •   Ran extensive SQL queries to ensure the data was intact and verified that the results of each query were correct.
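The actual schema was modeled in ERwin and pushed to SQL Server 2012, but the constraint ideas above are easy to show in miniature. The toy tables below are placeholders rather than the course schema, and they use Python's built-in sqlite3 only so the example is self-contained; the foreign key and ON DELETE CASCADE rule work the same way conceptually in SQL Server.

```python
# Toy illustration of foreign keys, validation, and cascading deletes -- not the course
# schema, which was modeled in ERwin and deployed to Microsoft SQL Server 2012.
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('PRAGMA foreign_keys = ON')  # SQLite requires this per connection
conn.executescript('''
    CREATE TABLE Customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL CHECK (length(name) > 0)    -- validation on entry
    );
    CREATE TABLE CustomerOrder (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        total       REAL CHECK (total >= 0),
        FOREIGN KEY (customer_id) REFERENCES Customer(customer_id)
            ON DELETE CASCADE                                  -- cascading removal rule
    );
''')
conn.execute("INSERT INTO Customer VALUES (1, 'Ada')")
conn.execute("INSERT INTO CustomerOrder VALUES (10, 1, 99.99)")
conn.execute("DELETE FROM Customer WHERE customer_id = 1")     # order 10 is removed too
print(conn.execute("SELECT COUNT(*) FROM CustomerOrder").fetchone()[0])  # -> 0
```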
View on Google Drive
Acquisition, Analysis, Reporting Digital Forensics:   Dead Box Analysis

Context

This class focused on a niche area of forensics commonly referred to as 'dead box forensics.' Although this was a group assignment, I spent a great deal of time sifting through data and documenting every step I took.

The Details

Our group was provided bit-stream images of hard drives belonging to two employees and a corporate server. Our job was to acquire the data and try to identify any malicious activity that may have taken place. The premise was based on the accusation that the two employees were planning to take part in an embezzlement scheme. We got to work and gave our full attention to the latter two of the three phases: analysis and reporting. After many hours, our class held a mock court in which the judge, our professor, interrogated every group regarding the case.

Execution

  •   Used professional-grade software such as FTK and EnCase to parse through large volumes of data.
  •   Attempted to establish concrete relationships between significant findings.
  •   Carefully documented each step and every attribute of potential evidence (location in FTK, OS path, MD5/SHA1 hashes, screenshots), as sketched below.
  •   Compiled our documentation into a lengthy report that categorized our findings, with an appendix of screenshots.
  •   Reviewed server logs and attempted to match IP addresses and timestamps to the appropriate activities.
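The hashing portion of that documentation is the easiest to show. FTK and EnCase compute MD5/SHA1 digests for you, but as a rough illustration, recording the same digests for an evidence file by hand might look like this (the helper name and chunk size are my own choices for the sketch).

```python
# Illustration of the MD5/SHA1 hashing recorded for each piece of evidence.
# In practice FTK/EnCase compute and verify these digests automatically.
import hashlib

def evidence_digests(path, chunk_size=1 << 20):
    """Return (md5, sha1) hex digests of a file, read in chunks to handle large images."""
    md5, sha1 = hashlib.md5(), hashlib.sha1()
    with open(path, 'rb') as f:
        while chunk := f.read(chunk_size):
            md5.update(chunk)
            sha1.update(chunk)
    return md5.hexdigest(), sha1.hexdigest()
```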
View on Google Drive
Solving the puzzle Information Technology Competition:   Digital Forensics

Context

This competition was put together by our student club, the Management Information Systems Student Association, along with industry professionals. Although I am passionate about offensive/defensive cyber security, I learned a lot through this competition about teamwork and the importance of patience and perseverance.

The Details

Our objective was to parse the data on three separate bit-stream images and determine whether any malicious activity had taken place. The narrative featured three potential suspects who attempted to collaborate to commit a variety of crimes. While no substantial evidence was found to convict the suspects, there was plenty of reason to believe malicious activity had occurred.

Execution

  •   Parsed through three large volumes of data using FTK.
  •   Uncovered the use of steganography to hide data within images.
  •   Reconstructed a damaged VMware image and uncovered many black-hat tools.
  •   Documented every finding and its unique attributes (location in FTK, OS path, MD5/SHA1, screenshots).
  •   Drafted a conclusive report that included a map of the activity, a suspect diagram, and our categorized findings.
  •   Presented our report to a panel of five judges from the IT industry.
View on Google Drive
Red Teaming Activities Offensive Security: Capture the Flag

Context

This project consisted of two offensive exercises, each with its own report. For both exercises, I acted as an ethical hacker attempting to break into a Windows box and a deliberately vulnerable Linux distribution known as 'Metasploitable'. Each 'flag' captured uncovered hints about the next obstacle. Despite these hints, a fair amount of recon was involved, and Kali Linux quickly became my tool of choice for the initial profiling of the vulnerable systems. I virtualized Kali and the vulnerable machines in VMware on the same virtual network.

The Details

During my analysis of the systems, I took a common industry approach and started the reconnaissance phase with automated tools such as Nmap. While automated tools often lack the depth of manual analysis, they cover a large attack surface, which allowed me to narrow my scope and make the most of the small time window provided. For both assignments, some flags came quickly while others were more difficult; regardless, all flags were found, and I took a lot away from the activity.

Execution

  •   Spun up Kali Linux and the Windows ISO in VMware and configured both on the same network.
  •   Started port scanning with Nmap for reconnaissance (a simplified sketch of the idea follows this list).
  •   Ran the Greenbone OpenVAS vulnerability scanner to narrow down the attack surface.
  •   Dumped hashes using an SMB exploit via Armitage and proceeded to crack them.
  •   Reconfigured the IIS IP address settings to expose a web server at a static IP.
  •   Used Hydra to brute-force authentication on a weak FTP service.
  •   Drafted a findings report with remediation suggestions.
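The real reconnaissance leaned on Nmap and OpenVAS, but the core idea of a TCP connect scan is simple to sketch. The Python snippet below is only an illustration -- the target address and port range are placeholders, not the lab machines from the exercises.

```python
# Bare-bones TCP connect scan, just to illustrate the recon step.
# The actual exercises used Nmap and OpenVAS; host and port range here are placeholders.
import socket

def scan(host, ports, timeout=0.5):
    """Return the subset of `ports` on `host` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports

if __name__ == '__main__':
    print(scan('192.168.56.101', range(20, 1025)))  # placeholder lab IP
```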
View the reports on Google Drive


Volunteer Experience

Volunteering is an integral part of our society and is embedded into our campus culture. As a volunteer, you offer your time and effort and in return gain a great deal of knowledge and experience.

Sponsored by FireEye, SHI, and more CPP Cyber Security Fair 2015

Context

Cal Poly Pomona hosts an annual Cyber Security Fair as part of our school tradition. The fair features guest speakers and industry leaders and serves as an open invitation to discuss the latest information security issues. Whether you work in industry or government, or just use the Internet personally, it is important to take proactive measures to keep your data safe.

Fair Activities

  •   As a team, coordinated the setup and takedown of all physical assets pertaining to the fair.
  •   Facilitated discussion between students and industry leaders regarding important information security topics.
  •   Distributed prizes to fair participants upon completion of a survey about personal security.
  •   Tallied votes for each student who showcased an information security poster for participants to view and vote on.
  •   Worked closely with Dr. Maria, an expert in positive psychology, to improve our demeanor and the effectiveness of our social interactions.
  •   Ultimately impressed senior staff and were recognized for our success.
View my certificate