NLProc End Game?
What is the future of NLP?
Interacting with huge multi-task models via an API?
Normally, huge models don't make commercial sense for a single company, but they do when they are served from reserved GPU cloud instances and the load is spread evenly across users in different time zones.
A wide logistics network doesn't make sense for a small company, but it does when it is used by many.
General, robust NLP APIs are just a full-blown exploitation of scalability.
Releasing internal tools for anyone to use is a proven strategy.
Companies save huge amounts of time and money because they no longer need data scientists, and project gestation periods shrink many-fold.
The biggest troubles of some companies are solved in an instant:
No hiring trouble
No data acquisition and cleaning trouble
No modelling project risks
No deployment delays
No scalability issues
Yes, there are nuances that general APIs cannot take care of. But is there an 80:20 in this game?
Haptik releases a demo: https://tech.conversation.ai/demo
OpenAI releases APIs: https://beta.openai.com
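To make the "one general API, many NLP tasks" idea concrete, here is a minimal sketch of what such an interaction can look like: the same endpoint handles sentiment, summarisation, extraction, and more, just by changing the prompt. It assumes the early OpenAI Completions endpoint and a davinci-style model name; the exact endpoint, model names, and response format have changed over time, so treat it as illustrative rather than a definitive client.

```python
import os
import requests

# Assumed legacy Completions endpoint; newer APIs use different routes.
API_URL = "https://api.openai.com/v1/completions"
API_KEY = os.environ["OPENAI_API_KEY"]  # your own API key


def classify_sentiment(text: str) -> str:
    """Zero-shot sentiment classification via a plain text prompt.

    No dataset, no training, no deployment: the task is defined
    entirely by the prompt sent to the hosted model.
    """
    prompt = (
        "Decide whether the sentiment of the review is Positive or Negative.\n"
        f"Review: {text}\n"
        "Sentiment:"
    )
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "model": "text-davinci-003",  # model name is an assumption
            "prompt": prompt,
            "max_tokens": 1,
            "temperature": 0,
        },
        timeout=30,
    )
    response.raise_for_status()
    # Legacy completions return generated text under choices[0]["text"].
    return response.json()["choices"][0]["text"].strip()


if __name__ == "__main__":
    print(classify_sentiment("The delivery was late and the product broke."))
```

Swapping the prompt for a summarisation or entity-extraction instruction is the whole "project": that is the scalability argument in code.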
Are we starting the consolidation phase of NLP?
Are we reaching the end game?