On other Linux distributions the output of ping includes the standard deviation (mdev), but on OSMC it does not. Is there any workaround other than writing a script to calculate it?
Output pasted for argument's sake:
ping -c3 www.google.com
PING www.google.com (82.147.54.26): 56 data bytes
64 bytes from 82.147.54.26: seq=0 ttl=59 time=14.920 ms
64 bytes from 82.147.54.26: seq=1 ttl=59 time=17.310 ms
64 bytes from 82.147.54.26: seq=2 ttl=59 time=14.451 ms
--- www.google.com ping statistics ---
3 packets transmitted, 3 packets received, 0% packet loss
round-trip min/avg/max = 14.451/15.560/17.310 ms
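For reference, in case no alternative ping binary is available: the output above looks like BusyBox's ping applet, which only reports min/avg/max. A minimal sketch of computing mdev the way iputils ping does (population standard deviation of the round-trip times), using the reply lines pasted above as sample input:

```python
import re
import statistics

# Reply lines captured from the ping run above (in practice you would
# read these from the output of `ping -c3 www.google.com`).
output = """\
64 bytes from 82.147.54.26: seq=0 ttl=59 time=14.920 ms
64 bytes from 82.147.54.26: seq=1 ttl=59 time=17.310 ms
64 bytes from 82.147.54.26: seq=2 ttl=59 time=14.451 ms
"""

# Extract the round-trip time from each reply line.
times = [float(t) for t in re.findall(r"time=([\d.]+) ms", output)]

# iputils ping reports mdev as the population standard deviation,
# i.e. sqrt(mean(t^2) - mean(t)^2); statistics.pstdev is the same formula.
mdev = statistics.pstdev(times)
print(f"mdev = {mdev:.3f} ms")  # → mdev = 1.252 ms
```

For the three samples shown (14.920/17.310/14.451 ms) this gives mdev ≈ 1.252 ms, consistent with the avg of 15.560 ms reported in the summary line.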