I manage a PostgreSQL database with many schemas, hundreds of tables, and thousands of columns. Going through every table and figuring out which columns should be indexed would be a very time-consuming task. A simple way to add indexes that are likely appropriate is to leverage Postgres' metadata tables.
The query below scans all tables for columns whose names contain "id" or "_at" (for example, "created_at") and returns the SQL statements needed to create indexes on those columns, if an index doesn't already exist.
If a column is already indexed, no indexing statement is generated for it. This filtering is accomplished in the last portion of the query, in the "AND column_name || table_name not in" clause.
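To make that concrete, here is a minimal sketch of the shape of such a query, built on the standard information_schema.columns view and the pg_index, pg_class, and pg_attribute catalogs; the name patterns and the generated index names are illustrative, not a definitive version.

```sql
-- Generate CREATE INDEX statements for likely-index-worthy columns,
-- skipping columns that already appear in an existing index.
SELECT format(
         'CREATE INDEX IF NOT EXISTS %I ON %I.%I (%I);',
         'idx_' || c.table_name || '_' || c.column_name,
         c.table_schema, c.table_name, c.column_name
       ) AS create_index_statement
FROM information_schema.columns AS c
WHERE c.table_schema NOT IN ('pg_catalog', 'information_schema')
  AND (c.column_name LIKE '%id%'
       OR c.column_name LIKE '%\_at%'      -- backslash escapes the LIKE wildcard
       OR c.column_name = 'created_at')
  -- Filter out columns that are already part of an index
  AND c.column_name || c.table_name NOT IN (
        SELECT a.attname || t.relname
        FROM pg_index i
        JOIN pg_class t ON t.oid = i.indrelid
        JOIN pg_attribute a ON a.attrelid = t.oid
                           AND a.attnum = ANY (i.indkey)
      );
```

Running this produces one CREATE INDEX statement per candidate column, which you can review and execute selectively rather than applying blindly.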
A better way to add indexes to an existing database is to analyze actual usage statistics, with something like pg_qualstats. The approach above, however, is a reasonable starting point if you don't want to spend time installing extensions like pg_qualstats.
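If you do go the pg_qualstats route, a rough sketch of that workflow looks like the following. This assumes pg_qualstats 2.x, which ships a pg_qualstats_index_advisor() function, and that the extension has been added to shared_preload_libraries so it can sample query predicates.

```sql
-- Assumes pg_qualstats 2.x is installed and listed in
-- shared_preload_libraries so it can sample WHERE-clause predicates.
CREATE EXTENSION IF NOT EXISTS pg_qualstats;

-- After a representative workload has run, ask the advisor for index
-- suggestions derived from the predicates it actually observed.
SELECT v
FROM json_array_elements(
       pg_qualstats_index_advisor() -> 'indexes'
     ) AS v;
```

Unlike the metadata-based query above, these suggestions reflect how your queries actually filter data, at the cost of installing and configuring the extension first.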