Automation Machine

Proof of Automation


  • Last updated December 14, 2017 at 12:42 PM
  • Evidence visible to public
Write up a summary about how you were able to change the process, and how the current automation process works.

All posted evidence

[Screenshot: Payable Time folder (left), Overtime Dashboard (middle), Query Editor with filepath (right)]

— jeremygibson, 22 days ago

Overtime Dashboard Automation for PWA

We generate a monthly report for departmental leadership on paid overtime. This was previously a manual process that could take a few hours to complete, because we had to pair employee names with their division and subdivision. Using Power BI, I improved three things:
  • Created a pseudo-database using the Get Data type 'Folder', where I drop new payable time files.
  • Created a second pseudo-database the same way, where I drop new PCRs (Excel files with employee departments, etc.).
  • Connected the two with a relationship in Power BI so the report can present both overtime and departments.

This let me automate the report and turn a process that took a few hours into a five-minute update.
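
A rough analogue of this setup, sketched in Python/pandas rather than Power BI: two "folder" sources combined into tables, then joined like the relationship described above. The folder paths and column names (EmployeeID, Division, Subdivision, OvertimeHours) are illustrative assumptions, not the actual file layout.

```python
from pathlib import Path
import pandas as pd

def load_folder(folder: str) -> pd.DataFrame:
    """Concatenate every Excel file in a folder, like Power BI's 'Folder' source."""
    files = sorted(Path(folder).glob("*.xlsx"))
    return pd.concat((pd.read_excel(f) for f in files), ignore_index=True)

payable_time = load_folder("PayableTime")  # one row per employee pay period
pcrs = load_folder("PCRs")                 # employee -> division/subdivision

# The "relationship": join overtime hours to department info on employee ID.
report = payable_time.merge(
    pcrs[["EmployeeID", "Division", "Subdivision"]],
    on="EmployeeID", how="left",
)

# Summarize paid overtime by division and subdivision for leadership.
overtime = (report.groupby(["Division", "Subdivision"])["OvertimeHours"]
            .sum().reset_index())
print(overtime)
```

With this pattern, updating the report is just dropping the new month's files into the folders and re-running, which is what makes the five-minute refresh possible.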
— jeremygibson, 22 days ago

Automated the process of running letter reports and verifying that all of the letters ran correctly.

Prior to developing this automation, all of our property maintenance letters had to be run by an individual in Crystal Reports and exported to a PDF daily. That person then had to run another report and manually verify that each letter that should have been included in one of the seven letter reports actually was, and then highlight the missing letters for another staff member to fix by hand.

I identified existing but unused functionality in a software system we already had (InfoView) that let us schedule the reports to run and email automatically each night. This saved several minutes each day, since staff no longer had to wait for each report to run; they could just download the PDF from their email in the morning. I also developed a way to automate the highlighting of the missing letters in the verification report, which saved a staff member a couple of hours each morning and significantly reduced the number of errors made in identifying missing letters.

To further reduce the staff time spent fixing errors and running missing letters individually, another report was developed and sent at the end of each day to all of our property maintenance inspectors, flagging inspections that were missing the close log stop time, one of the primary drivers of missing letters. This automated report allowed the records to be corrected the same day, before the letter reports ran, further reducing the number of errors and the time taken to identify and correct our letters daily.
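
The core of the verification step is a set comparison: letters that should have run versus letters that did. A minimal sketch of that idea in Python, assuming the two inputs can be exported as CSVs with a CaseID column (the file and column names are hypothetical):

```python
import csv

def load_ids(path: str, id_field: str) -> set[str]:
    """Read one column of a CSV export into a set of IDs."""
    with open(path, newline="") as f:
        return {row[id_field] for row in csv.DictReader(f)}

expected = load_ids("inspections_due.csv", "CaseID")     # should have a letter
generated = load_ids("letters_generated.csv", "CaseID")  # letters that ran

# Anything expected but not generated is a missing letter to flag.
missing = sorted(expected - generated)
with open("missing_letters.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["CaseID"])
    writer.writerows([m] for m in missing)

print(f"{len(missing)} letters missing; see missing_letters.csv")
```

Automating this comparison is what removes both the hours of manual checking and the human error in spotting which letters were skipped.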
— bbande01, 6 months ago

[Posted image: automation report]
— kengodson, about 1 year ago

Public Health and Wellness Childhood Lead Poisoning Prevention Database Application

The Department of Public Health and Wellness needed a way to automate the gathering and management of data for the Childhood Lead Poisoning Prevention Program. The need was to track information through the process of blood lead testing, follow-up testing, referrals to other professionals, home visits, and more. In addition, they needed reports that would display blood lead tests entered into the system the day before, their activities, and the status of individuals moving through the evaluation and treatment process.

I created a database application with a Microsoft Access front end and a SQL Server back end to do the job, and used Microsoft SQL Server Reporting Services to meet their reporting needs.

This database application is now used as a reference by the Commonwealth of Kentucky.
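
As an illustration of the "tests entered the day before" report, here is a sketch of the kind of query an application like this might run against the SQL Server back end. The server, database, table, and column names are all hypothetical:

```python
import pyodbc

# Connect to the (hypothetical) SQL Server back end.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=phw-sql;DATABASE=LeadPrevention;Trusted_Connection=yes;"
)

# Blood lead tests entered yesterday (assumed EnteredOn timestamp column).
sql = """
SELECT ChildID, TestDate, BloodLeadLevel, FollowUpStatus
FROM BloodLeadTests
WHERE EnteredOn >= CAST(DATEADD(day, -1, GETDATE()) AS date)
  AND EnteredOn <  CAST(GETDATE() AS date)
ORDER BY EnteredOn;
"""
for row in conn.cursor().execute(sql):
    print(row.ChildID, row.TestDate, row.BloodLeadLevel, row.FollowUpStatus)
```

In the actual application this reporting is handled by SQL Server Reporting Services rather than ad hoc scripts; the sketch just shows the shape of the daily query.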

— kaforski, over 1 year ago

Wrote a short ArcPy script to automate the updating of a map

There is a map used in PDS to display short-term rental registrations that needs to be periodically updated. I wrote a short ArcPy script that pulls the new data, geocodes it, and updates the source shapefiles. I realize this is kindergarten-level coding, but I knew zero Python when I started, and I am happy with how it turned out. It will save the user who maintains the map some tedium.
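
A minimal sketch of what such a script can look like. The input CSV, address locator, field mapping, and shapefile paths are all assumptions; the exact field map depends on the locator being used:

```python
import arcpy

arcpy.env.overwriteOutput = True

new_data = r"C:\str\new_registrations.csv"   # new short-term rental records
locator = r"C:\str\AddressLocator.loc"       # address locator for geocoding
geocoded = r"C:\str\geocoded_registrations.shp"
target = r"C:\str\str_registrations.shp"     # shapefile the map reads from

# Geocode the new records; 'Address' is the (assumed) single-line input field.
arcpy.geocoding.GeocodeAddresses(
    new_data, locator, "'Single Line Input' Address", geocoded
)

# Replace the map's source shapefile with the freshly geocoded features.
arcpy.management.CopyFeatures(geocoded, target)
print("Map source updated:", target)
```

Scheduling a script like this (for example, with Windows Task Scheduler) is what turns a recurring manual chore into a hands-off update.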
— dstgermain, over 1 year ago

Project Manager for Tolemi's Building Blocks work, which has automated several key data sets for the Vacant & Public Property Administration.

Prior to our four-year engagement with Tolemi (formerly known as Opportunity Space), our division had very little insight into how to monitor the performance of our programs. The lack of reliable data and an organized platform to analyze it resulted in mediocre property sales for several years. Through our engagement with Tolemi, we were able to use Open Data to track information such as property sales, foreclosures, and demolitions, and transfer that information onto their web-based GIS platform. I worked with Tolemi as a divisional SharePoint Administrator and GIS Analyst to ensure the accuracy of the information. We have since automated all of the aforementioned data sets and have created programming in response to information discovered through their analysis. Last year our sales jumped 203% through the creation of a vacant home sales program called Last Look.
— jwatkins, almost 2 years ago

Automated a dataset pull for OMB and gave them a great-looking interface to search through the data.

Some of this is private, so I will be very vague with the explanation. OMB needed a way to look through a dataset. The problem was that the dataset was provided as a TXT file on an FTP server, so they had to manually connect via FTP, download the file, and open it in Excel. I created a cron job that runs a bash script I wrote from scratch to connect to the FTP server, download the newest file, save it to a particular directory, and then make a curl request to a particular website that consumes the TXT file, parses it, and inserts it into a database. OMB can then log in to the website I created and search through the data to perform one of their daily tasks. This web application will also be used by Codes and Regulations :). If anyone working for Louisville Metro would like to see this web application, as well as the cron job, contact me for a demo.
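
A sketch of that pipeline, in Python rather than the original bash: fetch the newest TXT file from the FTP server, save it, then hit the ingesting endpoint the way the curl request does. The host, credentials, directories, and endpoint URL are all placeholders:

```python
import ftplib
import requests

FTP_HOST = "ftp.example.com"
REMOTE_DIR = "/exports"
LOCAL_DIR = "/var/data/omb"
INGEST_URL = "https://example.com/ingest"  # site that parses and loads the file

with ftplib.FTP(FTP_HOST, "user", "password") as ftp:
    ftp.cwd(REMOTE_DIR)
    files = sorted(f for f in ftp.nlst() if f.endswith(".txt"))
    newest = files[-1]  # assumes file names sort chronologically
    local_path = f"{LOCAL_DIR}/{newest}"
    with open(local_path, "wb") as f:
        ftp.retrbinary(f"RETR {newest}", f.write)

# Equivalent of the curl request: tell the web app to consume the new file.
requests.post(INGEST_URL, data={"file": newest}, timeout=60)
```

Run from cron, this removes the manual FTP-and-Excel step entirely; OMB only ever touches the search interface.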
— jorgefelico, almost 2 years ago

Automated uploading of datasets to the open data portal, and subsequent ingestion into the catalog.

I created a text file of URLs that is processed by cron on data.louisvilleky.gov; the job loops through each file URL, downloads it, and imports it into the Open Data catalog. From there, it uses drush to automatically import a select group of datasets into MySQL, which also makes those datasets available via a REST API.
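
A sketch of that nightly loop in Python. The file paths are assumptions, and the drush command shown is a hypothetical custom command standing in for whatever the actual Drupal import hook is called:

```python
import subprocess
import urllib.request

# Read the list of dataset URLs maintained on the server (assumed path).
with open("/etc/opendata/dataset_urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

# Download each dataset file into a staging directory.
for url in urls:
    filename = url.rsplit("/", 1)[-1]
    urllib.request.urlretrieve(url, f"/var/opendata/incoming/{filename}")

# Hand off to Drupal: a hypothetical custom drush command that imports the
# downloaded files into the catalog (and on into MySQL / the REST API).
subprocess.run(["drush", "opendata-import", "/var/opendata/incoming"], check=True)
```

The same structure works whether the loop lives in a shell script or Python; the key is that cron drives it end to end with no manual uploads.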
— mattgolsen, almost 2 years ago