

Engine braking does not have any effect on the clutch, other than while downshifting, and if you're blipping/rev-matching even that has practically no effect.


Exactly: if your clutch suffers abuse when you're shifting gears (unless you're trying to accelerate aggressively, of course), you need to rethink what you're doing.

Not saying it's smart, but when decelerating predictably on the highway I sometimes shift by rev-matching and changing gear without even touching the clutch, just for the fun of it.


Same on a motorcycle. My current bike has a quickshifter (for both up- and downshifting), but even without one you can shift up and down without ever touching the clutch, just by pushing on the shift pedal while blipping the throttle off-and-on (for an upshift) or on-and-off (for a downshift). That’s all the quickshifter is really doing anyway; it’s just quicker at it than you are.

I’d wager bordering on 100% of my clutch use is when coming to a complete stop.


Also, to add: the one service that was fast enough on the LLM side was Cerebras. The time to first token (TTFT) is incredibly fast (200-300 ms) and throughput is around 2,000 tokens/s for an 8B model; combined, that makes for a great conversational experience.
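As a rough back-of-the-envelope sketch (using the ballpark figures above, not official numbers), the full text of a short spoken reply is ready well under half a second after the request, which is what makes the turn-taking feel conversational:

  # Assumed figures from the comment above: ~250 ms TTFT, ~2000 tokens/s for an 8B model.
  TTFT_S = 0.25          # time to first token, in seconds
  TOKENS_PER_S = 2000    # generation throughput

  def reply_latency(num_tokens: int) -> float:
      """Seconds until the full reply text is available (before TTS/rendering)."""
      return TTFT_S + num_tokens / TOKENS_PER_S

  for n in (30, 80, 200):
      print(f"{n:>3} tokens -> ~{reply_latency(n) * 1000:.0f} ms")
  # e.g. a 30-token conversational turn is ready in roughly 265 ms.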


Oh no, what issue were you facing with the Hassaan bot? It might just have been getting the hug of death. Hope you can try again!


If you use the demo on the website (tavus.io), you don’t have to enter an email.


Thanks friend! Great to hear- let us know how we can help in any way :)


This is definitely a good idea! I think the hard part is making it contextual and relevant to the last question/response, in which case the LLM comes into the equation again. Something we're looking at though!


Perhaps use a small, fast LLM to maintain a rolling "disposition" state, and for each of a handful of dispositions have a handful of bridging emotes/gestures. You could have the small LLM use the second-most-recent user input to update the disposition asynchronously, and in moments where it's not clear just say "That's a good question," "Let me think about that," "I think that...", and so on.
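Something like this minimal sketch (the names and the classify_disposition stand-in are hypothetical; any small/fast model behind a chat API would do): update the rolling disposition in the background from the previous user turn, and pick a bridging gesture/filler line immediately while the main model is still thinking.

  import asyncio
  import random

  # Hypothetical bridging emotes/gestures per disposition (placeholder content).
  BRIDGES = {
      "curious":   [("lean_in", "That's a good question."),
                    ("nod",     "Let me think about that.")],
      "concerned": [("furrow",  "Hmm, I see what you mean."),
                    ("tilt",    "Let me make sure I understand.")],
      "positive":  [("smile",   "I think that..."),
                    ("nod",     "Right, so...")],
  }

  class DispositionTracker:
      def __init__(self):
          self.disposition = "curious"  # rolling state, updated in the background

      async def update(self, second_most_recent_user_input: str) -> None:
          # Stand-in for a call to a small/fast LLM that labels the input with
          # one of the keys in BRIDGES. Replace with a real classifier.
          self.disposition = await classify_disposition(second_most_recent_user_input)

      def bridge(self) -> tuple[str, str]:
          """Pick a (gesture, filler line) to play while the main LLM responds."""
          return random.choice(BRIDGES[self.disposition])

  async def classify_disposition(text: str) -> str:
      # Hypothetical: a tiny model prompted to return exactly one label.
      await asyncio.sleep(0.05)  # pretend network + inference latency
      return "positive" if "thanks" in text.lower() else "curious"

  async def main():
      tracker = DispositionTracker()
      # Kick off the disposition update on the *previous* user turn, async'ly...
      update = asyncio.create_task(tracker.update("thanks, that helped a lot"))
      # ...and immediately play a bridge for the current turn without waiting on it.
      gesture, line = tracker.bridge()
      print(gesture, "->", line)
      await update  # by the next turn, the disposition reflects that earlier input
      print("disposition is now:", tracker.disposition)

  asyncio.run(main())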


This is good feedback. We have a base subscription fee to cover ongoing costs of maintaining the models/replicas you create and other elements, otherwise it's all on-demand.


Sorry about that, friends: we had a hug-of-death event. Hope you can try again!


Haha great question- we're really passionate about the conversational video interface, and our goal is to make it /incredibly/ good, so we're going to continue to do research and release new models that accomplish this. There's so much to do in the pursuit of that.

