I'm Garrett.

picOfMe

About Me

Hi, I'm Garrett. I'm 25 years old and graduated from the University of Maryland with a B.S. in Physics and a B.S. in Mathematics in 2016.

Curiosity drives my life. Why are things the way they are? How does this thing work...or that! How could we build something better? What is a more complete and elegant solution? I grew up flipping houses with my father, and I learned I have a love for building things. Today, I still love designing, building, and implementing solutions.

The intersection of finance and technology continues to be of great interest to me. I have followed markets intently since college, and I love data analysis and the careful coding it takes to get accurate, consistent results. I crafted this website to showcase some of my abilities and to host future projects. If you would like to reach out with a project idea, to collaborate, or just to say hi, please send me a message under the contact section! (If the contact form has broken in the year and a half since I built it, email me directly at garrett2222@gmail.com.)



Projects

Technical Report

Geomagnetic Storms' Effect on the US Electrical Grid

Lab Reports

Charge to Mass Ratio of an Electron

Refraction of Light

Gaussian Beam Propagation with Lenses

Polarization of Light

Michelson Interferometer

Diffraction

Spectroscopy & Atomic Spectra

Neural Network

I am currently working on a simple multi-layer feed-forward neural network in Python. Given the way the world has been going, I thought I should learn about neural networks, and what better way to learn than to build your own! For now, the network is meant to be a continuation of the short report scraper. The idea is that I feed the network a number of stock/short-report parameters for the ticker returned by the scraper. Those parameters (pulled from various sites) are run through the neural network, which spits out a guess of how far the stock should move to the downside.
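As a rough illustration of what I mean by a feed-forward network, here is a minimal NumPy sketch. The layer sizes, the sigmoid activation, and the single regression output (a predicted downside move) are all illustrative assumptions, not my actual model:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class FeedForwardNet:
    """Tiny feed-forward net: inputs -> sigmoid hidden layer -> linear output."""

    def __init__(self, n_in, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = np.zeros(1)

    def forward(self, x):
        self.h = sigmoid(x @ self.W1 + self.b1)  # hidden activations
        return self.h @ self.W2 + self.b2        # linear output (predicted move)

    def train_step(self, x, y, lr=0.1):
        """One gradient-descent step on squared error for a single sample."""
        y_hat = self.forward(x)
        err = y_hat - y                          # dLoss/dOutput
        dW2 = np.outer(self.h, err)
        dh = (self.W2 @ err) * self.h * (1.0 - self.h)  # backprop through sigmoid
        dW1 = np.outer(x, dh)
        self.W2 -= lr * dW2
        self.b2 -= lr * err
        self.W1 -= lr * dW1
        self.b1 -= lr * dh
        return float((err ** 2).sum())
```

Training on even one normalized sample shows the mechanics: repeated `train_step` calls drive the squared error toward zero.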

Below is a link to the full project. It is run from the scraper, which feeds the ticker of a new short report to a data-pulling Python file, which in turn feeds the normalized data to the neural network. Ultimately the project returns the new short report headline, the ticker, and the neural network's guess as to how far the stock will move. As a trader, each of these pieces of information aids my trading. However, I am currently looking for a workaround for the data-pulling portion, as certain tickers break the code because pandas-datareader and the Yahoo Finance API recently became antiquated.
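The normalization and the fail-safes I have in mind might look something like the sketch below. The feature names, the per-feature bounds, and the fetcher interface are hypothetical stand-ins, not the actual contents of my data-pulling file:

```python
def normalize(features, bounds):
    """Min-max scale each raw feature into [0, 1] using assumed per-feature
    bounds, clamping outliers so an odd ticker can't blow up the network."""
    out = {}
    for name, value in features.items():
        lo, hi = bounds[name]
        scaled = (value - lo) / (hi - lo)
        out[name] = min(1.0, max(0.0, scaled))
    return out

def safe_pull(ticker, fetchers):
    """Try each data source in turn; return None (skip the ticker) if all
    of them fail instead of letting one bad source crash the pipeline."""
    for fetch in fetchers:
        try:
            return fetch(ticker)
        except Exception:
            continue  # broken API or delisted ticker: fall through to next source
    return None
```

Wrapping every source in `safe_pull` is one way to keep a single antiquated API from taking down the whole run.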

Eventually, after perfecting the data pulling and implementing some fail-safes, I hope to link all of this code to Interactive Brokers as an algorithmic trading strategy. Granted, I'll need lots of testing and tweaking for everything to run smoothly through the IbPy API (and let's not forget the 10 grand to open an account), but I do believe it could be a very profitable trading algorithm.

picOfautoShort
Short Report Scraper

As I mentioned, I worked at a proprietary trading firm. There was a good opportunity to make money when a reputable source released a new short report. However, the trade was only really great if you could get the ticker of the report as soon as the report was published. If you were two seconds late, your prints would be very poor (skewing the risk/reward), and the trade might not even be profitable.

I was dissatisfied with the speed and consistency of the third-party scraper that most traders used, so I set out to build a better one. While most websites dislike you, or will eventually block you, for scraping them at sub-second intervals, I knew it was possible. Using object-oriented Python, I built a script that put each website on its own thread and then split the threads across all four cores; again, fractions of a second were extremely important.

The scraper I built is able to check each of the eight websites I like to scrape roughly four times a second. Some sites are faster than others, but that is due to the response time of the websites' servers. When a new report is posted, my scraper prints the source, the new headline, and the ticker (if available), produces an audible beep (to alert me), and opens the webpage in a new browser window. All of these features enabled me to get to the ticker of the report and execute faster than with the third-party scraper.

picOfScraper

If you checked out the physics tab, you have probably seen my experience with LaTeX. Most of my lab reports in college were required to be written in LaTeX, as were some of the homework assignments (for the TA's convenience). Regrettably, the first week after college my computer was stolen while friends and I were out celebrating at the beach, so most of the base LaTeX files for the lab reports are gone. Luckily, I found the base LaTeX file for one of the most extensive lab reports in my email. Below is the base LaTeX file for the "Charge to Mass Ratio of an Electron" report under the Physics tab.

Charge to Mass Ratio of an Electron Base File

If you checked out the physics tab, you probably noticed that the majority of the analysis and simulation is done with Matlab. The technical report in particular has a simulation of what would happen to a very simplified version of the U.S. electrical grid when exposed to a large geomagnetically induced current. Below you can see that Matlab code and tinker with it if you would like.

picOfData
Double Pendulum

Because I entered the University of Maryland as a physics major, the coding classes I took were Matlab-based. Matlab was the tool of choice for most classes that required analysis and data manipulation.

As one of my first Matlab projects, I built a program that simulated the motion of a double pendulum and output a variety of analyses of the motion. It seemed like a perfect intersection of my courses at the time. Please keep in mind that the code is crude; it was one of my first projects in Matlab.
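For anyone curious about the underlying math, here is a small Python sketch of the standard double pendulum equations of motion integrated with a hand-rolled RK4 step. This is a translation of the idea, not my original Matlab code; the masses, lengths, and initial conditions are arbitrary:

```python
import math

# Parameters: masses, rod lengths, gravity (illustrative values).
M1, M2, L1, L2, G = 1.0, 1.0, 1.0, 1.0, 9.8

def deriv(s):
    """Time derivatives of the state [theta1, omega1, theta2, omega2]."""
    th1, w1, th2, w2 = s
    d = th2 - th1
    den1 = (M1 + M2) * L1 - M2 * L1 * math.cos(d) ** 2
    a1 = (M2 * L1 * w1 ** 2 * math.sin(d) * math.cos(d)
          + M2 * G * math.sin(th2) * math.cos(d)
          + M2 * L2 * w2 ** 2 * math.sin(d)
          - (M1 + M2) * G * math.sin(th1)) / den1
    den2 = (L2 / L1) * den1
    a2 = (-M2 * L2 * w2 ** 2 * math.sin(d) * math.cos(d)
          + (M1 + M2) * (G * math.sin(th1) * math.cos(d)
                         - L1 * w1 ** 2 * math.sin(d)
                         - G * math.sin(th2))) / den2
    return [w1, a1, w2, a2]

def rk4_step(s, dt):
    """Classic fourth-order Runge-Kutta step."""
    k1 = deriv(s)
    k2 = deriv([x + 0.5 * dt * k for x, k in zip(s, k1)])
    k3 = deriv([x + 0.5 * dt * k for x, k in zip(s, k2)])
    k4 = deriv([x + dt * k for x, k in zip(s, k3)])
    return [x + dt / 6.0 * (a + 2 * b + 2 * c + e)
            for x, a, b, c, e in zip(s, k1, k2, k3, k4)]

def energy(s):
    """Total mechanical energy; a handy sanity check on the integration."""
    th1, w1, th2, w2 = s
    ke = (0.5 * M1 * (L1 * w1) ** 2
          + 0.5 * M2 * ((L1 * w1) ** 2 + (L2 * w2) ** 2
                        + 2 * L1 * L2 * w1 * w2 * math.cos(th1 - th2)))
    pe = -(M1 + M2) * G * L1 * math.cos(th1) - M2 * G * L2 * math.cos(th2)
    return ke + pe
```

Because the trajectory is chaotic, checking that total energy stays nearly constant over a short run is a better sanity check than comparing individual angles.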

picOfData


Resume

Skills

FRONT END

picOfSkill
picOfSkill
picOfSkill
picOfSkill

BACK END

picOfSkill
picOfSkill

OTHER

picOfSkill
picOfSkill


Contact Me