
r/mxnet


temporal segment network load increases on inference

I tried running inference with a pretrained TSN model for action recognition from the GluonCV model zoo. For the first few frames the CPU consumption was low, but it gradually increased as inference moved on to later frames.
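For reference, a minimal sketch of the kind of per-frame loop I mean (the model name 'vgg16_ucf101' and the random dummy frames are placeholders, not my exact setup):

# Sketch of a per-frame TSN inference loop with a pretrained GluonCV model.
# 'vgg16_ucf101' and the dummy frames are assumptions for illustration only.
import mxnet as mx
from mxnet import nd
from mxnet.gluon.data.vision import transforms
from gluoncv.model_zoo import get_model

# Stand-in for decoded video frames (HxWxC uint8); real decoding not shown.
frames = [nd.random.uniform(0, 255, shape=(240, 320, 3)).astype('uint8')
          for _ in range(64)]

# Standard ImageNet-style preprocessing for a 224x224 input.
transform_fn = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

net = get_model('vgg16_ucf101', nclass=101, pretrained=True)

for frame in frames:
    data = transform_fn(frame).expand_dims(axis=0)  # shape 1 x 3 x 224 x 224
    pred = net(data)
    # Wait for this frame's result; MXNet ops run asynchronously,
    # so without this the engine can keep queueing work.
    pred.wait_to_read()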

#gluonCV #actionrecognition #TSN


Accessing Mxnet MMS from external machine

Hello all,

I am trying to use MXNet MMS (multi-model-server) to make my MXNet model accessible to my website. MMS runs without problems and I can use the served model on the same machine. However, I want to access the served model from another server; the two servers are connected to each other over the LAN. To test the connection, I ran the following command from the terminal of Server A (MMS is running on Server B), and I got an error.

curl -X POST http://127.0.0.1:8080/predictions/hand -T image.jpg

Although this command runs without problems when I run it on the same server, it does not work from the other server. What do I need to do to access the served model from another server?

I also tried using Server B's IP address instead of localhost. However, that gives the error below:

curl: (7) Failed to connect to $IP_address$ port 8080: Connection refused
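
For reference, what I ultimately want to do from Server A (the website backend) looks roughly like this; the address 192.168.1.20 is a hypothetical stand-in for Server B's LAN IP, and 'hand' is the model name from the curl example above:

# Sketch of calling the MMS inference endpoint on Server B from Server A.
# 192.168.1.20 is a hypothetical LAN address, not the real one.
import requests

with open('image.jpg', 'rb') as f:
    resp = requests.post('http://192.168.1.20:8080/predictions/hand', data=f)

print(resp.status_code)
print(resp.text)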

Could anyone help?

Thanks and best.