Kimmel & Construction Jobs

Kimmel & Associates

With Kimmel & Associates, I grew a lot as a programmer: I picked up real-world experience with a different programming language (Ruby) and used several new frameworks (Laravel for PHP, and Ruby on Rails). Probably the best thing that happened was that it helped get me past my old irrational dislike of frameworks.

Construction Jobs

Construction Jobs was quite fun. I was a team of one, and I built a replacement for their old website, which included a very carefully crafted migration script of my own design to preserve all the data from their old job board system. After the website I built was launched, I spent more than another year upgrading it with a never-ending list of new features. I had such a lovely boss, who was able to clearly express desired features for the website, and while the priorities would shift, sometimes daily, there was never any pressure to work long into the night.
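
The migration itself was essentially a long extract-transform-load pass over the old database. As a rough sketch of the shape of it, in Ruby — the table and column names here are made up, and I'm pretending both sides were Postgres for simplicity:

```ruby
# Rough sketch only: hypothetical table/column names, Postgres on both sides.
require 'pg'

old_db = PG.connect(dbname: 'legacy_job_board')
new_db = PG.connect(dbname: 'construction_jobs')

old_db.exec('SELECT * FROM jobs') do |legacy_jobs|
  legacy_jobs.each do |row|
    # Keep the legacy id so related records can be re-linked after the move.
    new_db.exec_params(
      'INSERT INTO job_postings (legacy_id, title, city, state, description)
       VALUES ($1, $2, $3, $4, $5)',
      [row['job_id'], row['job_title'].to_s.strip,
       row['city'], row['state'], row['job_desc']]
    )
  end
end
```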

Building the website was pretty neat. We used a service to parse job seekers' resumes into individual pieces of data, so hiring managers had the option of viewing every resume in a consistent format, making candidates easier to compare. It also meant that job seekers only had to review their information after uploading a resume, rather than having to type it all in again.
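
In outline, the intake flow looked something like this sketch, with hypothetical names throughout (the real parser is described below):

```ruby
# Hypothetical sketch: parse the resume once at upload time, then pre-fill
# the job seeker's profile so they only confirm and correct, never retype.
class StubResumeParser
  # Stand-in for the third-party parsing service.
  def parse(_resume_bytes)
    { 'name' => 'Jane Mason', 'job_title' => 'Project Manager' }
  end
end

def draft_profile(resume_bytes, parser: StubResumeParser.new)
  parser.parse(resume_bytes) # structured fields used to pre-fill the form
end
```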

The resume parser we used had a SOAP interface and returned quite extensive XML. Beyond the extracted information itself, it gave a confidence score for each piece of data, which let us flag fields for the user when we wanted to make sure a human reviewed them. Having every resume split into fields made our resume search, backed by Solr, quite powerful, letting employers search by job title, education, years of experience, and so on.
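
The flagging worked roughly like this; the XML below is a made-up stand-in for the vendor's (much larger) schema, and the 0.8 threshold is illustrative:

```ruby
require 'nokogiri'

# Hypothetical shape of a parser response; the real XML was far more extensive.
response_body = <<~XML
  <resume>
    <field name="name" confidence="0.97">Jane Mason</field>
    <field name="job_title" confidence="0.91">Project Manager</field>
    <field name="years_experience" confidence="0.62">12</field>
  </resume>
XML

NEEDS_REVIEW_BELOW = 0.8 # illustrative threshold

doc = Nokogiri::XML(response_body)
fields = doc.xpath('//field').map do |f|
  { name: f['name'],
    value: f.text.strip,
    needs_review: f['confidence'].to_f < NEEDS_REVIEW_BELOW }
end

fields.select { |f| f[:needs_review] }.each do |f|
  # These are the fields we highlighted on the job seeker's review form.
  puts "Please double-check: #{f[:name]} => #{f[:value]}"
end
```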

The most frequently requested feature that I handled was automatically importing job postings from a construction company’s website.

Very rarely would a careers section have anything like a feed, so 90% of the time it was up to me to build a web scraper that assembled a list of all the jobs and figured out, for each one, whether there was an existing record to update or whether it was a new job. There was also the matter of sorting out which jobs were no longer posted, so that we could remove them from our website.
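
Every scraper boiled down to the same sync loop. A condensed Ruby sketch, with a made-up URL and selectors, and an in-memory hash standing in for the database:

```ruby
require 'open-uri'
require 'nokogiri'

def scrape_jobs(url)
  page = Nokogiri::HTML(URI.open(url))
  # Key each posting by its detail-page URL so runs can be compared.
  page.css('.job-listing').map do |node|
    [node.at_css('a')['href'], node.at_css('.title').text.strip]
  end.to_h
end

def sync(url, db)
  scraped = scrape_jobs(url)

  # With a real database these branches were an UPDATE and an INSERT.
  scraped.each do |job_url, title|
    if db.key?(job_url)
      db[job_url] = title # existing record: refresh it
    else
      db[job_url] = title # brand-new posting: create it
      puts "added: #{title}"
    end
  end

  # Jobs we have on file that no longer appear on the site were taken down.
  (db.keys - scraped.keys).each { |gone| puts "removed: #{db.delete(gone)}" }
end
```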

Beyond the complexities of programmatically mimicking a user visiting a website, a big piece of the work was detecting problems and making sure I was alerted, so I could correct my code to compensate for changes companies made to their websites.
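
In practice that meant wrapping every scrape in sanity checks, something like the sketch below (the thresholds and the alert hook are illustrative). A sudden drop to zero jobs usually meant the careers page had been restructured, not that the company had no openings:

```ruby
def checked_scrape(company, last_count)
  jobs = yield
  if jobs.empty? && last_count > 0
    alert("#{company}: found 0 jobs (had #{last_count}); selectors likely stale")
  elsif last_count > 0 && jobs.size < last_count / 2
    alert("#{company}: job count fell from #{last_count} to #{jobs.size}")
  end
  jobs
rescue StandardError => e
  alert("#{company}: scrape failed: #{e.message}")
  []
end

def alert(message)
  warn(message) # stand-in; the real thing emailed me
end
```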