
Analyzing GitHub Stats via ClickHouse via Chat

TL;DR: I prompt-engineered the system prompt in ChatCraft to turn it into a GitHub analytics tool: github analyst chat.

Here are the top 10 GitHub projects that have received more stars in the last 2 days than in the previous 2 weeks
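For flavor, a query of roughly this shape could produce that list. The table and column names below come from the public github_events dataset (where a WatchEvent row roughly corresponds to a star); the exact SQL the chat generated may well have differed:

```javascript
// Hypothetical sketch of a "trending repos" query, wrapped in JS the way
// ChatCraft's generated code would carry it. github_events schema names
// are from the public ClickHouse GitHub dataset; this is an assumption,
// not the actual query from the chat.
function trendingQuery(recentDays = 2, baselineDays = 14) {
  return `
    SELECT repo_name,
           countIf(created_at >  now() - INTERVAL ${recentDays} DAY) AS recent_stars,
           countIf(created_at <= now() - INTERVAL ${recentDays} DAY) AS baseline_stars
    FROM github_events
    WHERE event_type = 'WatchEvent'
      AND created_at > now() - INTERVAL ${recentDays + baselineDays} DAY
    GROUP BY repo_name
    HAVING recent_stars > baseline_stars
    ORDER BY recent_stars DESC
    LIMIT 10`;
}
```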

Longer Story

  1. I suggested to David that we have a newsfeed of new features on chatcraft.org. He replied that we should try to use chatcraft.org itself to generate those.

  2. I found simonw’s blog post on using a public ClickHouse instance.

  3. I used David’s new “edit system prompt” feature in combination with my “Run Code” feature and the new-ish chatgpt-16k model to chat with the rather wide GitHub history table in ClickHouse.

    The UI is a work in progress, but the chat above can be read as follows:

    a. ChatGPT generates JS code that performs ClickHouse SQL queries.

    b. The user clicks the “Run Code” button to execute the generated JS (with no sandbox). The output of the JS code is then fed back into ChatGPT.

    This works similarly to the new function-calling feature that OpenAI released.
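    As a concrete sketch, the generated JS boils down to a small helper like this. The playground endpoint and `explorer` user come from Simon’s post and may have changed; `withFormat` is an illustrative helper, not ChatCraft code:

    ```javascript
    // Sketch of the kind of JS that "Run Code" executes; its text output
    // is what gets fed back into the chat. Endpoint and user for the
    // public ClickHouse playground are assumptions from Simon's post.
    function withFormat(sql, format = "JSONEachRow") {
      // Append an output format unless the SQL already ends with one.
      return /\bFORMAT\s+\w+\s*$/i.test(sql.trim())
        ? sql
        : `${sql.trim()} FORMAT ${format}`;
    }

    async function runClickHouse(sql) {
      const resp = await fetch("https://play.clickhouse.com/?user=explorer", {
        method: "POST",
        body: withFormat(sql),
      });
      if (!resp.ok) throw new Error(`ClickHouse HTTP ${resp.status}`);
      return resp.text(); // fed back into ChatGPT as the next message
    }
    ```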

  4. I completely failed to ship a “what’s new” feature. After having too much fun with ClickHouse, I managed to get a prototype for commit summarization working too. It turns out GitHub sets friendly CORS headers on its commit-history API; the ChatGPT model helpfully suggested that :) Here is that demo.
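The commit-summarization prototype reduces to two steps: fetch recent commits from GitHub’s REST API (which a browser page can call directly thanks to those CORS headers), then condense them into a short digest to hand to the model. A hedged sketch with illustrative function names, not the actual demo code:

```javascript
// Fetch recent commits straight from the browser; GitHub's permissive
// CORS headers on this endpoint make that possible.
async function fetchCommits(owner, repo, perPage = 20) {
  const url = `https://api.github.com/repos/${owner}/${repo}/commits?per_page=${perPage}`;
  const resp = await fetch(url, {
    headers: { Accept: "application/vnd.github+json" },
  });
  if (!resp.ok) throw new Error(`GitHub API HTTP ${resp.status}`);
  return resp.json();
}

// Condense the API response into a compact plain-text list that fits
// comfortably in a chat prompt for summarization.
function commitDigest(commits) {
  return commits
    .map((c) => `- ${c.sha.slice(0, 7)} ${c.commit.message.split("\n")[0]}`)
    .join("\n");
}
```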

Why Bother?

The only way to get a feel for the chat-assisted future is to build it. For us, ChatCraft is already way more productive than ChatGPT. We are only scratching the surface of what’s possible.

Update 2023-07-10: Gitstagram Concept

In the process of exploring the GitHub ClickHouse dataset I noticed pictures showing up in my markdown. I decided to make a joke side project: Gitstagram - mindless scrolling for dev types. It’s a feed of the latest images posted in GitHub comments, and a fun way to get a read on what’s happening on GitHub; I’ve already found a few new projects and reconnected with a few past favorites.
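The core of the feed is just pulling markdown image links out of comment bodies. A minimal sketch of that extraction step (the real feed gets the comment text from the ClickHouse dataset):

```javascript
// Extract the URLs of markdown images, i.e. ![alt](url) patterns,
// from a GitHub comment body.
function extractImageUrls(markdown) {
  const urls = [];
  const re = /!\[[^\]]*\]\(([^)\s]+)[^)]*\)/g;
  let m;
  while ((m = re.exec(markdown)) !== null) {
    urls.push(m[1]);
  }
  return urls;
}

// Example:
// extractImageUrls("before ![screenshot](https://example.com/a.png) after")
// yields ["https://example.com/a.png"]
```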

I was able to get ChatCraft to explain how to restructure my ClickHouse scrape-for-images query to use an index, which made it fast enough to render a web page. I had been hesitating to learn ClickHouse for ~4 years now; it’s pure joy to learn an exotic SQL system via an AI assistant. Nice to be able to learn cool new tech on a whim!
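For the curious, that kind of restructuring usually amounts to filtering on the table’s sorting key, so ClickHouse can prune by its primary index instead of scanning every comment. A hedged before/after sketch, assuming the dataset is sorted with event_type as its leading key; the column names are assumptions, not my actual query:

```javascript
// Hedged before/after sketch of an index-friendly rewrite, carried as
// SQL strings the way ChatCraft's generated JS would hold them.
const slow = `
  SELECT body
  FROM github_events
  WHERE body LIKE '%![%](%'`; // full scan: no filter on the sorting key

const fast = `
  SELECT body
  FROM github_events
  WHERE event_type = 'IssueCommentEvent' -- leading sort-key column, so the primary index prunes
    AND created_at > now() - INTERVAL 1 DAY
    AND body LIKE '%![%](%'`;
```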