wrapped.dev

Spotify Wrapped for GitHub repos with AI

The only music I see is lines of code. We wanted to celebrate the work done on open source, so we decided to create an AI-powered wrap for GitHub repos, not just users! It's quite incredible how much time and effort devs put into things :)

Top comment

Hi folks, Vaibhav and Ethan (@actual_ethan) here! wrapped.dev was a fun escape from work over the holidays and an attempt to build something that celebrates the work done on open-source repositories.

What is wrapped.dev? ✨
wrapped.dev analyzes your repository's entire year of development, using AI to understand the quality and impact of code changes. It's like Spotify Wrapped, but for your code!

How it works:
1. We git clone a repository.
2. We run every file-change in every commit from this year through static analysis checks + llama-70b to build insights.
3. We build a year in review via some aggregations plus a one-shot prompt over the insights we gleaned.

More about the AI tech stack:
Actual AI: Used to gather repository statistics that we feed into the LLM context window, and to programmatically truncate commit information. It uses a combination of static analysis and LLMs to get the most accurate information. It's not about lines of code, the number of commits, or the story points; it's the quality of the code you shipped.
BAML: All our prompting and LLM selection is done in BAML. Instead of hoping the LLM will follow the instructions in the prompt, BAML guarantees it with algorithms run on top of any LLM.
Gemini 1.5 Pro: Gemini's huge 2M context window is amazing! While expensive, it works well for general analysis.
Bedrock + Llama 3.3 70B: We would have been screwed without open-source LLMs 🤑. We use smaller models for analyzing every file-change in every commit. AWS Bedrock made it super easy and cheap to host!

Why did we do this?
We became friends here in the Seattle founder community and wanted something fun to do during the holidays that wasn't work (but secretly was -- founder life 🙃). For context, Ethan is the co-founder over at actual.ai, which built a lot of the commit analysis pipelines. Vaibhav is one of the creators of BAML (an open-source LLM framework, built in Rust, and especially good for structured data).

Is this free?
Not for us 😅 But this was just really, really cool to us... Send us some coffee if you want, or don't. We did this for fun, and we wanted to share it with you. (Since this does cost us real money, we do manually review the repo requests before running them!)

Can I see the prompts?
Yes! We're open-sourcing all the wrapped.dev prompts soon!

Let us know what you think in the comments: feedback, feature requests, you name it. We hope you have as much fun checking out these repo "wrapped" pages as we did making them!

Cheers,
Vaibhav & Ethan
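To make the three-step pipeline described in the comment above concrete, here is a minimal Python sketch under stated assumptions: the repository URL, the analyze_file_change placeholder (standing in for the static-analysis + LLM step), and the aggregation are all hypothetical illustrations, not wrapped.dev's actual code.

```python
# Hypothetical sketch of a wrapped.dev-style pipeline: clone, analyze every
# file-change in this year's commits, then aggregate into a year in review.
import subprocess
import tempfile
from collections import Counter
from datetime import date


def clone_repo(url: str) -> str:
    """Step 1: git clone the repository into a temp directory."""
    workdir = tempfile.mkdtemp(prefix="wrapped-")
    subprocess.run(["git", "clone", "--quiet", url, workdir], check=True)
    return workdir


def commits_this_year(repo_dir: str) -> list[str]:
    """Commit hashes since January 1 of the current year."""
    since = f"{date.today().year}-01-01"
    out = subprocess.run(
        ["git", "-C", repo_dir, "log", f"--since={since}", "--pretty=%H"],
        check=True, capture_output=True, text=True,
    )
    return out.stdout.split()


def file_changes(repo_dir: str, commit: str) -> list[str]:
    """Per-file diff chunks for one commit (splitting the patch is simplified)."""
    out = subprocess.run(
        ["git", "-C", repo_dir, "show", "--patch", "--format=", commit],
        check=True, capture_output=True, text=True,
    )
    return [chunk for chunk in out.stdout.split("diff --git") if chunk.strip()]


def analyze_file_change(diff: str) -> dict:
    """Step 2 placeholder: static-analysis checks plus an LLM call would go here."""
    return {"category": "code" if "def " in diff else "other", "quality": 3}


def build_year_in_review(insights: list[dict]) -> dict:
    """Step 3: aggregate insights; a one-shot LLM prompt over this summary would follow."""
    return {
        "changes_analyzed": len(insights),
        "by_category": dict(Counter(i["category"] for i in insights)),
    }


if __name__ == "__main__":
    repo = clone_repo("https://github.com/example/example-repo")  # placeholder URL
    insights = [
        analyze_file_change(diff)
        for commit in commits_this_year(repo)
        for diff in file_changes(repo, commit)
    ]
    print(build_year_in_review(insights))
```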
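The comment describes step 2 as Llama 3.3 70B on AWS Bedrock, with BAML enforcing structured output. The sketch below approximates that step in plain Python with boto3, replacing the analyze_file_change placeholder above. The model ID, prompt wording, and JSON parsing are assumptions for illustration; in the real project, BAML would reportedly handle the prompting and output structure rather than raw boto3 calls.

```python
# Illustrative only: scoring one file change with a Llama model on AWS Bedrock.
# Model ID, prompt, and response parsing are assumptions; verify against the
# Bedrock docs/console, and note the project itself drives this through BAML.
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-west-2")

# Assumed identifier for Llama 3.3 70B Instruct; the exact ID may differ by region.
MODEL_ID = "meta.llama3-3-70b-instruct-v1:0"


def analyze_file_change(diff: str) -> dict:
    prompt = (
        "Review the following git diff for one file change. "
        "Respond with JSON containing 'category' and 'quality' (1-5).\n\n"
        f"{diff[:8000]}"  # truncate large diffs to keep the request small
    )
    response = bedrock.invoke_model(
        modelId=MODEL_ID,
        body=json.dumps({
            "prompt": prompt,       # request body shape used by Llama models on Bedrock
            "max_gen_len": 256,
            "temperature": 0.2,
        }),
    )
    payload = json.loads(response["body"].read())
    # Llama responses on Bedrock return the generated text under "generation".
    # A production pipeline (or BAML) would validate/repair this JSON before use.
    return json.loads(payload["generation"])


if __name__ == "__main__":
    sample = "diff --git a/app.py b/app.py\n+def hello():\n+    return 'hi'\n"
    print(analyze_file_change(sample))
```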