[Launch dashboard: Product upvotes vs the next 3 · Product comments vs the next 3 · Product upvote speed vs the next 3 · Product upvotes and comments · Product vs the next 3 — data loading]

Aqueduct

The easiest way to run open source LLMs

Aqueduct's LLM support makes it easy for you to run open-source LLMs on any infrastructure that you use. With a single API call, you can run an LLM on a single prompt or even on a whole dataset!

Top comment

Hi everyone! LLMs have taken the world by storm, but using them is a pain (or a non-starter) for most people due to concerns around data privacy, IP ownership, and cost. Open-source LLMs like LLaMA, Dolly, and Vicuna have enabled enterprises to think about using LLMs, but they're a pain to operate. At Aqueduct, our goal has been to enable ML teams to use the best technology without the operational nightmare of running ML in the cloud, and we're super excited to share that Aqueduct now allows you to run open-source LLMs with a single API call.

➡️ Aqueduct's Python API allows you to call an LLM with a single line of code (see the first image above). No need to worry about installing drivers, managing library dependencies, or debugging configuration parameters.

☁️ Aqueduct is designed to work with any infrastructure you use; you can run your LLMs on a large server or on a Kubernetes cluster. You can even have Aqueduct spin up a cluster for you.

🔁 You can publish your LLM-based workflows to run ad hoc or on a fixed schedule using Aqueduct.

💡 Aqueduct's visibility features extend naturally to LLMs, so you can see what parameters or prompts you used and how performance evolves over time.

We'd love to hear what you think! Check out our open-source project or join our Slack community.

GitHub: https://github.com/aqueducthq/aq...
Slack: https://slack.aqueducthq.com
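The "single API call" interface described above can be sketched in plain Python. This is an illustrative stand-in, not Aqueduct's actual API: the names `llm_op` and the stubbed response are assumptions for demonstration, and the stub simply echoes the prompt where a real implementation would dispatch to a model running on your infrastructure.

```python
# Illustrative sketch of a single-call LLM interface, modeled on the pattern
# Aqueduct describes. `llm_op` and the stub response are hypothetical, not
# Aqueduct's real API.
from typing import Callable, List, Union

Prompts = Union[str, List[str]]


def llm_op(model: str) -> Callable[[Prompts], Prompts]:
    """Return a callable that runs `model` on one prompt or a whole batch."""

    def generate(prompts: Prompts) -> Prompts:
        def run_one(prompt: str) -> str:
            # A real implementation would invoke the model on your
            # infrastructure (a large server, a Kubernetes cluster, etc.).
            return f"[{model}] response to: {prompt}"

        if isinstance(prompts, str):
            return run_one(prompts)  # single prompt -> single response
        return [run_one(p) for p in prompts]  # dataset -> list of responses

    return generate


vicuna = llm_op("vicuna-7b")
print(vicuna("What is an LLM?"))
print(vicuna(["prompt one", "prompt two"]))
```

The point of the pattern is that the same callable accepts either one prompt or a list of prompts, which is how a one-line call can cover both a single question and a whole dataset.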

About Aqueduct on Product Hunt

Aqueduct launched on Product Hunt on May 10th, 2023 and earned 107 upvotes and 9 comments, placing #17 on the daily leaderboard.

On the analytics side, Aqueduct competes within Software Engineering, Developer Tools, Artificial Intelligence and GitHub — topics that collectively have 1.1M followers on Product Hunt. The dashboard above tracks how Aqueduct performed against the three products that launched closest to it on the same day.

Who hunted Aqueduct?

Aqueduct was hunted by Vikram Sreekanti. A "hunter" on Product Hunt is the community member who submits a product to the platform, uploading the images and link and tagging the makers behind it. Hunters typically write the first comment explaining why a product is worth attention, and their followers are notified the moment they post. Around 79% of featured launches on Product Hunt are self-hunted by their makers, but a well-known hunter still acts as a signal of quality to the rest of the community. See the full all-time top hunters leaderboard to discover who is shaping the Product Hunt ecosystem.

Reviews

Aqueduct has received 1 review on Product Hunt with an average rating of 5.00/5. Read all reviews on Product Hunt.

For a complete overview of Aqueduct including community comment highlights and product details, visit the product overview.