(Unofficial) Weekend Project / Hackathon Ideas

Weekend Projects are a great idea. I am creating a thread specifically for ideas for them so as not to pollute the threads outside of their specific context…

(Unofficial) Weekend Project / Hackathon 1: JFK Files
Winner: @jochenschultz

(Unofficial) Weekend Project / Hackathon 2: Glass Beads
Winner: @_j

4 Likes

Thank you! That’s a great idea. I highly appreciate that.

Let me add something:

In case you have any ideas, it would also be nice if you could put a tiny bit of work into them and at least search for, and link to, a data source that might be worth exploring.

For example, a load of PDF or image files that are publicly available (and copyright-free).

Or if you have a company and need some (suitable) company data analysed, then why not sponsor some OpenAI API credits so it is more accessible for everyone? For example, put up the cancer research problem no one in your company could solve, and I am sure the hive can find a way.

Btw. here is a great source to search for datasets:

2 Likes

I declare @jochenschultz the winner of the first one! Yes, it got derailed a bit, but, as @phyde1001 said, I think the idea is solid?

I marked his post as the solution for that one and closed it.

I pass the torch for #2!

3 Likes

@PaulBellow

A wonderful idea. Maybe the next one should be maintained by Regulars+?

There could be a free-for-all “communications” thread? The high-signal content, in my opinion, should be curated for the people actually participating.

3 Likes

I’m all for this… and not just 'cause I’m busy haha…

Seriously, though, we could use this thread to plan them out maybe?

I think my first one went sideways as it’s a bit controversial, but in my defense, it was a huge new dataset! :wink:

1 Like

I am in no way trying to take over. Separate interactive communications was entirely my motivation for this thread.

I think there are different levels at which people might participate, on an intellectual, professional, and fun basis, which might help to break down barriers and engage a broader audience.

(For anyone wondering: you can’t import the PDFs directly into a GPT, because it only sees an image in them, but you can import a screenshot image, which it will then OCR. Smart bunch at OpenAI, but there’s still a need to train up Tech Support ^^)

I see value in a one-line GPT that can do some analysis on a single document, and another value in a system that can run deeper analysis on thousands of documents… The real point is that both systems are built on deep layers of code designed by others.

Not everyone will have the time or the credit balance to analyse thousands of images for a fun weekend project; maybe wider engagement has a different value.

Finding the right question is one step away from finding the right answer.

2 Likes

No, starting a meta-thread on it was a good idea, I think. We do some of this stuff in the Lounge occasionally, but it’s good to do it here too. :slight_smile:

The last one went off the rails a bit, but it might’ve been partially my fault for such a controversial subject. (But again, it was just a new/varied dataset, so…)

Anyway, I hope this doesn’t die. I don’t care who is “in charge,” but I think it could benefit our community in a lot of ways… and be fun too!

Maybe we could split… 2 threads… one for ChatGPT “solutions” vs the API “custom solutions”? Hrm…

If this starts to get traction, we might be able to sweet-talk OpenAI into free credits or something? No promises!

But yeah… something fun to do here!

2 Likes

Anyone have an idea for this weekend?

Something more open-ended this time? Custom GPT solution vs homebrew?

Friday approacheth! :slight_smile:

1 Like

I just had a crazy idea…

It’s kinda literary…

Glass Bead Game? Small smile.

By Herman Hesse…

We could try to use AI to make the Glass Bead Game? haha

Here's a GPT-4.5 rundown...

Glass Bead Game Rundown:

The Glass Bead Game, originally from Hermann Hesse’s novel The Glass Bead Game (Magister Ludi), is an intellectual game that synthesizes knowledge across multiple fields like mathematics, art, science, philosophy, and music. Participants creatively link seemingly unrelated ideas through metaphorical and thematic connections, forming beautiful intellectual sequences. It’s less about competition and more about exploring meaningful interdisciplinary insights.

Core Concepts:

  • Interdisciplinary synthesis: Players create connections between diverse disciplines.
  • Symbolic language: The game uses symbolic representations (Glass Beads) for complex concepts or ideas.
  • Collaborative reflection: Focus on meaningful connections rather than competition.

AI Hackathon Adaptation Idea:

Transforming the Glass Bead Game into an AI Hackathon invites participants to build digital platforms or AI tools that capture the spirit of the original game—emphasizing creativity, interdisciplinary insight, and meaningful connection rather than just optimization or problem-solving.

How it might work practically:

  1. Kick-off:
  • Introduction to the Glass Bead Game’s philosophy.
  • Teams brainstorm how to digitally implement interdisciplinary knowledge synthesis.
  2. Teams:
  • Mixed groups of programmers, AI developers, artists, writers, philosophers, and researchers.
  3. Goal of the hackathon:
  • Develop prototype software or AI platforms enabling participants to creatively connect and synthesize concepts across fields (science, art, literature, math, philosophy).
  • Example: AI-powered “concept cards” or “digital beads” that dynamically interact to form narrative sequences or intellectual chains.
  4. Techniques encouraged:
  • NLP (Natural Language Processing) for semantic connections.
  • Knowledge graphs to visualize interdisciplinary links.
  • Generative AI for creating dynamic narrative structures or artistic interpretations.
  5. Evaluation criteria:
  • Creativity & originality in connecting ideas.
  • Depth and elegance of interdisciplinary connections.
  • User experience—ease of use, aesthetic appeal, and enjoyment.
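The NLP "semantic connections" step above can be prototyped in a few lines. A minimal sketch, assuming hand-invented toy vectors (in a real build the concept vectors would come from an embedding model), showing the core "linking" operation as plain cosine similarity:

```python
import math

# Toy concept vectors (invented for illustration only).
CONCEPTS = {
    "fugue":      [0.9, 0.1, 0.3],   # music
    "recursion":  [0.8, 0.2, 0.4],   # mathematics
    "watercolor": [0.1, 0.9, 0.2],   # art
}

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two dense vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_link(bead: str) -> str:
    """Return the most thematically similar 'bead' from another field."""
    others = {k: v for k, v in CONCEPTS.items() if k != bead}
    return max(others, key=lambda k: cosine(CONCEPTS[bead], others[k]))

print(best_link("fugue"))  # -> recursion
```

Everything else (the knowledge graph, the generated narrative) would hang off this one primitive, which makes it a nice weekend-sized core.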

Example Projects:

  • GlassChain: AI-powered web app using NLP to automatically suggest thematic links between ideas across fields.
  • Synthesia.ai: Platform visually mapping conceptual connections between artistic, scientific, and philosophical concepts.
  • Conceptual Symphony: Generative AI produces multimedia (visual/music/text) compositions based on players’ interdisciplinary inputs.

Benefits:

  • Encourages creative thinking beyond standard hackathon problem-solving.
  • Promotes cross-disciplinary collaboration.
  • Yields innovative AI-powered educational or creative tools.

Would you like more detail on how you might structure the hackathon, or explore a different aspect?

2 Likes

I really like the idea of interdisciplinary teams.
Combining at least one developer with domain experts from different backgrounds would make sure both parties can contribute.

Using vibe-coding techniques, the domain experts create a naive implementation and the developer can guide them towards best practices (and help when the model’s capabilities reach their boundaries).

And I also agree that the general idea might be a little bit too much for a weekend hackathon.

A single part of that on the other hand would be perfect.

2 Likes

It was just impossible to get o1-pro to fulfill this writing assignment. So, hackathon challenge #1: find the right AI model that turns everything I would never spend an hour-plus writing (but still have to write out during extended explanations) into:


Hackathon coding challenge proposal

Python developers building next-generation networking middleware increasingly choose Trio’s structured concurrency for its clarity, reliability, and predictable performance at high concurrency, a replacement for asyncio. But today, they face a critical barrier: the lack of a native Trio-compatible HTTP/3 and QUIC implementation—currently available only via asyncio-based libraries like aioquic. This hackathon challenge will bridge this gap, enabling developers to fully leverage HTTP/3 multiplexing, robust UDP support, and TLS 1.3 encryption directly within Trio’s powerful concurrency model. Participating teams will build essential extensions to Trio’s networking stack, integrate cryptographic handshakes, and adapt or rewrite aioquic to natively support Trio. Success here unlocks transformative possibilities in Python network middleware, empowering clearer code, enhanced performance, and rapid adoption of Trio for modern web applications.

Read More

Hackathon Challenge Proposal

Bringing HTTP/3 to Trio: Closing the Gap from Asyncio to Structured Concurrency


Introduction & Background

In the Python ecosystem, asynchronous programming has become the cornerstone for building performant and scalable network applications, middleware, and APIs. Two dominant paradigms have emerged:

  1. Asyncio: Python’s built-in asynchronous library featuring event loops, tasks, futures, and callbacks. Libraries like aioquic—currently the leading Python QUIC/HTTP3 implementation—are built explicitly around asyncio.

  2. Trio: A newer, innovative structured-concurrency-based asynchronous library that offers clearer, safer, and more maintainable concurrency models through constructs like nurseries and strict cancellation scopes. Trio’s architectural strengths make it particularly suitable for high-performance, high-bandwidth, mission-critical tasks—like customer authentication proxies, API-to-API backend microservices, and real-time middleware. These applications often require predictability in resource usage, streamlined error handling, and graceful shutdown—all strong points for Trio.

However, a significant gap remains:

  • HTTP/3 and QUIC (built upon UDP and TLS 1.3) have gained rapid industry adoption, providing multiplexing of streams, faster handshakes, reduced latency, built-in encryption, and overcoming TCP’s head-of-line blocking issues. Yet, the Python ecosystem’s leading QUIC library, aioquic, currently relies exclusively upon asyncio. It does not readily integrate into the Trio ecosystem.

This disconnect limits Python developers who prefer structured concurrency with Trio from leveraging modern HTTP/3 protocols easily and cleanly.

It provides a great opportunity for an AI-empowered hackathon development team to show their ability to implement ideas rapidly and cohesively, using the latest artificial-intelligence agents to plan, strategize, and implement whole concepts on existing large code bases.


Why This is a Worthy Hackathon Challenge

This challenge idea—porting HTTP/3 and QUIC functionality into the Trio ecosystem—perfectly captures an important and timely intersection of modern networking technology and Python concurrency paradigms. Here’s why:

1. High Industry Demand

  • HTTP/3 and QUIC have moved beyond experimental stages and are in active, widespread deployment at global-scale platforms (e.g., Google, Cloudflare, Facebook). Python application developers working on cutting-edge middleware, proxies, API gateways, and microservice backends increasingly need QUIC/HTTP3 integration. Bringing this directly into Trio would rapidly expand Python’s practical use cases and provide developers a cleaner, safer alternative to asyncio.

2. Benefits of Trio over Asyncio for Network Middleware

  • Trio’s structured concurrency provides clear visibility and control over async tasks, making it ideal for writing robust and fault-tolerant network middleware:
    • Predictable Resource Management: Nurseries explicitly manage the lifetime of concurrent tasks.
    • Graceful Error Handling & Cancellation: Tasks started in Trio’s nurseries automatically handle exceptions, cleanup, and cancellations elegantly.
    • Readability & Maintainability: Clearer code structure simplifies debugging, maintaining, and scaling complex asynchronous applications.
    • Performance Stability: High-bandwidth, high-concurrency networking workloads (such as authentication proxies, API gateways, or backend middleware) can significantly benefit from Trio’s predictable resource usage and concurrency guarantees.

3. Urgent Need for HTTP/3 Ecosystem Growth in Python

  • aioquic currently dominates the Python QUIC landscape, but its strict coupling to asyncio severely restricts integration with other paradigms. There is no practical alternative right now for native, full-featured HTTP/3 support directly within Trio’s ecosystem.

4. Community Opportunity & Contribution

  • Building or adapting aioquic for Trio integration creates tangible value for the broader Python community:
    • Improves diversity in Python networking libraries.
    • Encourages innovative contributions around structured concurrency.
    • Provides fresh learning and growth opportunities in asynchronous networking, cryptographic integrations, and protocol-level implementations.

Coding Effort Scope & Focus

1. Gap Analysis & “Glue” Needed

The major gap that teams will address includes:

  • Trio UDP Support:

    • Trio provides basic UDP support, but QUIC protocol’s sophisticated demands (precise packet-level handling, custom packet scheduling, retransmission strategies, congestion control, MTU discovery, advanced socket configuration, etc.) are not fully accommodated.
    • Glue needed: Extend Trio’s UDP networking layer for advanced low-level socket options and precise packet handling necessary for QUIC.
  • TLS 1.3 Integration with Trio:

    • QUIC fundamentally relies on tight integration with TLS 1.3, managing key derivation, packet encryption/decryption, session resumption (0-RTT), and ALPN (Application-Layer Protocol Negotiation).
    • Glue needed: Integration or adaptation of existing Python TLS libraries (e.g., OpenSSL through Python bindings or the cryptography library) with Trio’s concurrency and IO abstractions.
  • Adapting or Rebuilding aioquic:

    • The current aioquic library is explicitly asyncio-only. Direct reuse is nontrivial due to differences in concurrency primitives and cancellation/error handling models.
    • Glue needed: Refactoring or rewriting parts of aioquic to use Trio’s structured concurrency primitives, task handling, cancellation scopes, and error propagation.
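To make the UDP gap concrete, here is the raw, blocking plumbing that a Trio-native QUIC layer would have to wrap with async send/receive, per-packet deadlines, and cancellation. A hypothetical stdlib-only sketch (socket names and the helper are mine, not from any library):

```python
import socket

def udp_roundtrip(payload: bytes) -> bytes:
    """Send one datagram to a local receiver and return what arrived."""
    recv_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    recv_sock.bind(("127.0.0.1", 0))   # OS-assigned port
    recv_sock.settimeout(1.0)          # QUIC needs per-packet deadlines
    addr = recv_sock.getsockname()

    send_sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        send_sock.sendto(payload, addr)     # one datagram == one QUIC packet
        data, _peer = recv_sock.recvfrom(2048)
        return data
    finally:
        send_sock.close()
        recv_sock.close()

print(udp_roundtrip(b"quic-initial"))  # -> b'quic-initial'
```

The hackathon glue is everything this sketch omits: retransmission timers, congestion control, MTU probing, and hooking it all into Trio's cancellation scopes instead of a blocking `settimeout`.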

2. Concrete Deliverables Expected

The hackathon challenge would explicitly target the following achievable, measurable deliverables:

  • Minimal viable UDP enhancements to Trio enabling robust QUIC implementation.
  • TLS 1.3 integration demonstration, compatible with QUIC’s handshake protocol.
  • QUIC Handshake Proof-of-Concept using Trio-native concurrency.
  • Minimal HTTP/3 multiplexed streams prototype, showcasing multiple HTTP/3 concurrent requests & responses within a Trio nursery-based structured concurrency setup.

Suggested Approach & Team Focus Areas

Teams may approach the hackathon from these directions:

  • Library & Infrastructure Team: Extend and improve Trio’s UDP support layer explicitly for QUIC’s networking demands.

  • Cryptography & TLS Integration Team: Bridge existing Python TLS 1.3 libraries into Trio’s async I/O, managing key generation, handshakes, encryption/decryption cleanly within Trio’s event loop and concurrency structure.

  • Protocol Adaptation & aioquic Rewrite Team: Rewrite, refactor, or adapt critical sections of the existing aioquic codebase to run within Trio’s concurrency model—effectively porting asyncio-based coroutine models to Trio nurseries and cancellation handling.

  • Application & Middleware Demonstration Team: Use the newly adapted HTTP/3 and QUIC libraries to demonstrate realistic middleware applications, such as authentication proxies or API gateway samples, proving practical usefulness and improved performance under Trio.


Impact of Successful Completion

If completed successfully, this hackathon challenge would result in:

  • A robust, performant, maintainable HTTP/3 and QUIC implementation for the Trio ecosystem, immediately usable in production scenarios.
  • A significant new contribution to the Python async networking community, promoting structured concurrency as a strong alternative to asyncio.
  • Valuable practical experience and documentation generated by the teams, benefiting the wider Python community and increasing adoption of Trio in production workloads.

Conclusion & Recommendation

This hackathon proposal uniquely combines important Python ecosystem advancements with pragmatic real-world needs. The challenge:

  • Aligns perfectly with Trio’s strengths and modern Python programming practices.
  • Provides critical new capabilities currently unavailable to Python middleware developers.
  • Benefits a wide range of use cases demanding robust, performant, clear asynchronous solutions.

I strongly recommend choosing this challenge idea for the next hackathon. It clearly meets the Python community’s urgent needs, provides a meaningful real-world impact, and promises exciting and productive collaboration opportunities for participants.

Clearly not the kind of thing that would actually be done by anyone on this forum.

It is easy to imagine encouraging engagement with ideas and organization and a framework. However, the ideas are easier to write than the doing. Realistically, nobody is doing the doing. Are OpenAI partners from prior startup funding rounds chiming in here with their developed solutions based on early access to APIs (turned into testimonials)? Negative.

This topic can only speak to a very small number of people that aren’t simply hit-and-run posters. You are speaking to yourselves. The attention to anything will scroll off the screen with the next bug or broken API.

(that AI model - gpt-4.5-preview)

You mean something like this:

run_receiver.py

#!/usr/bin/env python3
import logging
from prod_pipeline.receiver.consumer_app import ConsumerApp
from prod_pipeline.receiver.decoder import Decoder
from prod_pipeline.receiver.http3_receiver import HTTP3Receiver
from prod_pipeline.receiver.quic_receiver import QUICReceiver
from prod_pipeline.receiver.tls_decryptor import TLSDecryptor
from prod_pipeline.receiver.udp_receiver import UDPReceiver

def main():
    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s [%(levelname)s] %(name)s: %(message)s")
    bind_address = ("0.0.0.0", 9090)
    
    # Build the receiver pipeline:
    consumer_app = ConsumerApp()
    decoder = Decoder(consumer_app=consumer_app)
    http3_receiver = HTTP3Receiver(decoder=decoder)
    quic_receiver = QUICReceiver(http3_receiver=http3_receiver)
    tls_decryptor = TLSDecryptor(quic_receiver=quic_receiver)
    
    udp_receiver = UDPReceiver(bind_address=bind_address, tls_decryptor=tls_decryptor)
    
    logging.info("Receiver pipeline started. Listening for messages on %s", bind_address)
    try:
        udp_receiver.listen()  # Blocking call; listens indefinitely until interrupted.
    except KeyboardInterrupt:
        logging.info("Receiver shutdown requested. Exiting...")

if __name__ == "__main__":
    main()

run_sender.py

#!/usr/bin/env python3
import logging
from prod_pipeline.sender.producer_app import ProducerApp
from prod_pipeline.sender.encoder import Encoder
from prod_pipeline.sender.http3_sender import HTTP3Sender
from prod_pipeline.sender.quic_sender import QUICSender
from prod_pipeline.sender.tls_encryptor import TLSEncryptor
from prod_pipeline.sender.udp_sender import UDPSender
from prod_pipeline.sender.network import Network
import time

def main():
    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s [%(levelname)s] %(name)s: %(message)s")
    # Ensure that the receiver is already up and listening on port 9090.
    remote_address = ("127.0.0.1", 9090)
    
    # Build the sender pipeline components:
    network = Network(remote_address=remote_address, timeout=5.0)
    udp_sender = UDPSender(network=network)
    tls_encryptor = TLSEncryptor(udp_sender=udp_sender)
    quic_sender = QUICSender(tls_encryptor=tls_encryptor)
    http3_sender = HTTP3Sender(quic_sender=quic_sender, stream_id=9)
    encoder = Encoder(http3_sender=http3_sender)
    producer_app = ProducerApp(encoder=encoder)

    # Pause to ensure receiver is ready (in production you would have more robust coordination)
    time.sleep(1)
    
    # Send a test message:
    producer_app.create_message("Hello, production world!")
    
    # Close the network socket after sending.
    network.close()

if __name__ == "__main__":
    main()

Didn’t really dive into it… just vibe coding. :smile:

The application layer would look more like...
# Note: Response and _LoopThread are defined elsewhere in the module.
from typing import Any, Dict, Optional, Union

class Client:
    """
    Synchronous HTTP/3 client that spawns a background thread with an event loop.
    Each request is performed asynchronously under the hood, but from the user's
    perspective it's fully synchronous. 
    """
    def __init__(
        self,
        headers: Optional[Dict[str, str]] = None,
        timeout: float = 20.0
    ) -> None:
        self._loop_thread = _LoopThread()
        self._loop_thread.start()
        self._default_headers = headers or {}
        self._timeout = timeout
        self._closed = False

    def __enter__(self) -> "Client":
        return self

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        self.close()

    def close(self) -> None:
        if not self._closed:
            self._closed = True
            self._loop_thread.stop()

    def _internal_request(
        self,
        method: str,
        url: str,
        headers: Optional[Dict[str, str]] = None,
        json_body: Optional[Dict[Any, Any]] = None,
        data: Optional[Union[str, bytes]] = None,
        stream: bool = False,
    ) -> Response:
        if not url.startswith("https://"):
            raise ValueError("Only https:// URLs are supported.")
        # We'll reuse the same logic as in AsyncClient, but call it in the background...

I’ve split it up into single components so each of them would be a good example for a hackathon e.g. implementing a proper HTTP/3 stream framing…
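For the stream-framing component, the wire format is small enough to prototype directly. A sketch of the two primitives involved, QUIC variable-length integers (RFC 9000, section 16) and the generic HTTP/3 frame layout (RFC 9114), as a weekend starting point:

```python
def encode_varint(v: int) -> bytes:
    """QUIC variable-length integer: a 2-bit length prefix in the
    first byte selects a 1-, 2-, 4-, or 8-byte big-endian encoding."""
    if v < 1 << 6:
        return v.to_bytes(1, "big")
    if v < 1 << 14:
        return (v | (0b01 << 14)).to_bytes(2, "big")
    if v < 1 << 30:
        return (v | (0b10 << 30)).to_bytes(4, "big")
    if v < 1 << 62:
        return (v | (0b11 << 62)).to_bytes(8, "big")
    raise ValueError("value too large for a QUIC varint")

def encode_frame(frame_type: int, payload: bytes) -> bytes:
    """HTTP/3 frame = type varint + length varint + payload."""
    return encode_varint(frame_type) + encode_varint(len(payload)) + payload

# A DATA frame (type 0x0) carrying 5 bytes:
print(encode_frame(0x0, b"hello").hex())  # -> 000568656c6c6f
```

The matching decoder, plus handling for frames split across QUIC stream chunks, is exactly the kind of self-contained piece a single hackathon team could own.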

Would this be a suitable outline?

So maybe this?


import threading
import socket
import time
from typing import Optional, Dict, Any, Union

# Import the sender components of our pipeline
from prod_pipeline.sender.producer_app import ProducerApp
from prod_pipeline.sender.encoder import Encoder
from prod_pipeline.sender.http3_sender import HTTP3Sender
from prod_pipeline.sender.quic_sender import QUICSender
from prod_pipeline.sender.tls_encryptor import TLSEncryptor
from prod_pipeline.sender.udp_sender import UDPSender
from prod_pipeline.sender.network import Network

# Import the receiver components of our pipeline
from prod_pipeline.receiver.consumer_app import ConsumerApp
from prod_pipeline.receiver.decoder import Decoder
from prod_pipeline.receiver.http3_receiver import HTTP3Receiver
from prod_pipeline.receiver.quic_receiver import QUICReceiver
from prod_pipeline.receiver.tls_decryptor import TLSDecryptor
from prod_pipeline.receiver.udp_receiver import UDPReceiver

# A simple Response wrapper for our client.
class Response:
    def __init__(self, status_code: int, content: str, headers: Optional[Dict[str, str]] = None):
        self.status_code = status_code
        self.content = content
        self.headers = headers or {}
    
    def __repr__(self):
        return f"<Response {self.status_code}: {self.content}>"

class Client:
    """
    Synchronous HTTP/3 client built on top of our custom pipeline library.
    It sends a request via the sender pipeline and waits for a response
    using the receiver pipeline. The response is simulated here to show
    how you might integrate the pipelines.
    """
    def __init__(
        self,
        remote_address: Optional[tuple] = None,
        timeout: float = 20.0,
        headers: Optional[Dict[str, str]] = None,
    ) -> None:
        # Use the provided remote address or default to localhost:9090.
        self.remote_address = remote_address or ("127.0.0.1", 9090)
        self.timeout = timeout
        self.headers = headers or {}

        # Build the sender pipeline components.
        self.network = Network(remote_address=self.remote_address, timeout=self.timeout)
        self.udp_sender = UDPSender(network=self.network)
        self.tls_encryptor = TLSEncryptor(udp_sender=self.udp_sender)
        self.quic_sender = QUICSender(tls_encryptor=self.tls_encryptor)
        self.http3_sender = HTTP3Sender(quic_sender=self.quic_sender, stream_id=9)
        self.encoder = Encoder(http3_sender=self.http3_sender)
        self.producer_app = ProducerApp(encoder=self.encoder)

        # Build the receiver pipeline components.
        # Note: In a real implementation the consumer would capture an actual response.
        self.consumer_app = ConsumerApp()
        self.decoder = Decoder(consumer_app=self.consumer_app)
        self.http3_receiver = HTTP3Receiver(decoder=self.decoder)
        self.quic_receiver = QUICReceiver(http3_receiver=self.http3_receiver)
        self.tls_decryptor = TLSDecryptor(quic_receiver=self.quic_receiver)
        # Bind the receiver on the same port for demonstration.
        self.receiver_bind_address = ("127.0.0.1", self.remote_address[1])
        self.udp_receiver = UDPReceiver(bind_address=self.receiver_bind_address, tls_decryptor=self.tls_decryptor)

        # Placeholder for the response.
        self._response = None
        # Launch a background thread that listens for an incoming packet.
        self._receiver_thread = threading.Thread(target=self._listen_for_response, daemon=True)
        self._receiver_thread.start()

    def _listen_for_response(self):
        """
        Listen for a UDP packet corresponding to a response and process
        it through our receiver pipeline. In a real client this would be
        hooked into an established connection mechanism.
        """
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(self.receiver_bind_address)
        sock.settimeout(self.timeout)
        try:
            data, _ = sock.recvfrom(4096)
            # Process the received data through the pipeline.
            self.tls_decryptor.decrypt(data)
            # In our simulation, the ConsumerApp does not record the response.
            # Instead, we generate a dummy response.
            self._response = Response(status_code=200, content="Simulated response")
        except Exception as exc:
            self._response = Response(status_code=500, content=f"Error: {exc}")
        finally:
            sock.close()

    def request(
        self,
        method: str,
        url: str,
        headers: Optional[Dict[str, str]] = None,
        json_body: Optional[Dict[Any, Any]] = None,
        data: Optional[Union[str, bytes]] = None,
        stream: bool = False,
    ) -> Response:
        """
        Synchronously send an HTTP/3 request and wait for the response.
        The outbound request is routed through our sender pipeline while
        the response is received by the receiver pipeline.
        """
        # Here, for demonstration, we simplify the request processing.
        # We pass a formatted string as a message through the pipeline.
        request_message = f"{method} {url}"
        self.producer_app.create_message(request_message)
        
        # Wait for the receiver thread to process an incoming "response".
        self._receiver_thread.join(timeout=self.timeout)
        return self._response

    def __enter__(self) -> "Client":
        return self

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        self.close()

    def close(self) -> None:
        """
        Cleanly close the client, shutting down network resources.
        """
        self.network.close()

call it like this?

import concurrent.futures
from client import Client

def send_request(i: int) -> str:
    """
    Create a client instance, send a GET request to a unique URL,
    and return a string representation of the response.
    """
    with Client(remote_address=("127.0.0.1", 9090), timeout=5.0) as client:
        response = client.request("GET", f"https://example.com/request{i}")
        return f"Request {i}: {response}"

def main():
    num_requests = 50
    with concurrent.futures.ThreadPoolExecutor(max_workers=num_requests) as executor:
        # Launch 50 parallel requests
        futures = [executor.submit(send_request, i) for i in range(num_requests)]
        for future in concurrent.futures.as_completed(futures):
            try:
                result = future.result()
                print(result)
            except Exception as exc:
                print(f"Generated an exception: {exc}")

if __name__ == "__main__":
    main()

That’s not quite the idea. And already done over here…

Replace the symbol with class TrioClient: and go from there :blush: and hit :brick:

Similar with uvloop, but that’s not the target.

Thank you @_j … I have a few suggestions that might address some of your concerns…

Before I do… @PaulBellow has apologised twice for the JFK project; however, clearly I did start the fight (dare I say ‘fired the first shot’ :confused: ), and for this I should take responsibility.

  1. Weekend Projects - User @razvan.i.savin addressed the ‘Arrogance’ of these ideas in the JFK thread, and it’s nice, @_j, to see someone of your calibre backing up this sentiment… To imagine that everyone here knows how to set up all the parts of these systems is nonsense; to assume they are of use to most people is again nonsense.

  2. Forum Members - Who are the forum members (developers)? Forum stats say ~1,000,000; the last number I saw OpenAI tout just before Christmas was 3,000,000 developers worldwide (maybe there’s an opt-out), and many here are just starting out, learning the basics or creating GPTs. What are these members developing? Are they super-active developers, what is their audience, and are they keeping their products updated around the aforementioned bugs? Are these ‘Weekend Projects’ tailored to them, or to an elite group of members in the Lounge that ^^ may or may not already get free credits as a perk?

  3. Marketing - I have mentioned before that I helped build a tool brand in China. I don’t know much about contract politics and stuff; I went in to work one day and was told to sign a contract in Chinese or lose my visa… What do you do? Say goodbye to your kids?.. The point, though, is that while there I ended up building and running a $1,000,000 PPC budget, daily running the numbers for hundreds of thousands of keywords in multiple languages… Marketing is not coding; it’s tailoring to your audience and bringing them in with the right medium.

The value of the Developer community on these emails?

Looking forward to this @jochenschultz and looking forward to seeing @OpenAI mentioning this in their weekly Developer emails ^^.

  4. I have more, but first I’ll post this to give the Regulars something to chew on: a (maybe laughable but quite open-ended) project that could be done in code or in GPTs…

Creative Accounting

Concept is for app to accept images/stats/descriptions etc… any data of a product

Task is to:

  • Break down monetary and environmental costs of the product, based on stats like where it’s from and what it appears to be made of
  • Come up with alternative materials / purchasing options - not from sites, just product alternatives
  • Come up with alternative ideas for fulfilling the need that may not need money… The goal here is to ensure a reasonably similar/durable etc. result

Points for creative solutions… e.g. $4 1 kg weights replaced with taped $1 1 kg bags of rice, or empty milk bottles filled with sand… Could be funny results…

Multi-Modal, could set different levels of difficulty ie from text stats/image inputs etc

The first results of the GPT model actually haven’t been really good. When you read the linked LinkedIn data you’ll see I have published a PDF with the files that gpt-4o-mini gave at least one score of 100/100 - and I didn’t really expect a lot from that one prompt either (each category would require at least a specialized agent with a better prompt - which is obviously way out of scope for a short “unofficial hackathon”).

I did some other extractions of data with other methods e.g. spaCy.

But what is missing is domain knowledge. Like I stated here

Which basically means that the shooting of JFK happened before most of us were even born, and I am not really planning to become a domain expert on that case - I have sent some mails to people who are, but got no response.
So for me the case is closed - maybe I’d open it up again if someone comes along who knows the right questions - which I would be happy to answer.

Indeed…

So as a non-technical solution, and without domain-specific knowledge, this is rather an issue…

That said… For my project I have evidence of domain specific support… And a technical solution…

(This was OpenAI solution…)

A new generation of developers is currently evolving. There are the old guys who still believe that they are better than the models, or who only use line or block completion, while it is already possible to work on a higher level and without the need to learn a special programming language.

So I would say this is a good chance for people to get involved. Old devs might have a little head start - but not for long. Many devs have no idea how to create products or what customers’ and users’ needs are, and they will have a very hard time learning how to talk to people.

Also, “arrogance”? Really? When I use words like “bullshit” on stuff where this word is by far the most accurate terminology?

I was trying to stop general chatting like this (we are actually doing it again); I tried to do it nicely, and I tried to do it aggressively. Won’t do it again. From now on I’ll just flag stuff as spam or off-topic.

1 Like