-
Nothing 'dedicated', and old is upgraded to new (ykwim)
-
Follows (1), but for TIH, we have a couple of A100s (around 10)
-
Mostly A100s (8-10 maybe), dunno about H100 or A200
-
Reach out to Profs with an appropriate reason for usage (they'll share the next steps)
-
No, but you can get GPU infra really cheap (using Groq, etc.)
-
To some extent, yes, but I'd say it's slow but steady.
Neel
u/Neilblaze
Interesting! I wanted to know if we can collaborate on some work and publish it at an A* conf (like SIGCHI, or at least something close). I'm currently doing my MS in CS, and my background is in NLP, Explainable AI, HAI, LLM Evals, and Product Engineering.
Lmk if we can. If yes, just respond and I'll dm.