Hi Chenjie,

Here are the logs. Around 2019-07-19 02:04 UTC I restarted the VM. In openstack.log I found some error messages; I don't know if they're relevant.

2019-07-19 02:04:16.141 186477 INFO eventlet.wsgi.server [req-6600ca74-1f93-4e54-88c1-35f964f1e055 2798eb7d8ca94c3eb4c134eb47bca7ea cea798d27ac44ca8b871877fd2adfeea default - -] 127.168.204.3,127.168.204.3 "GET /v1/alarms/summary HTTP/1.1" status: 200  len: 306 time: 0.0178909
2019-07-19 02:04:21.077 243655 ERROR neutron.agent.linux.async_process [-] Error received from [ovsdb-client monitor tcp:127.0.0.1:6639 Interface name,ofport,external_ids --format=json]: ovsdb-client: tcp:127.0.0.1:6639: receive failed (End of file)
2019-07-19 02:04:21.077 243655 ERROR neutron.agent.linux.async_process [-] Process [ovsdb-client monitor tcp:127.0.0.1:6639 Interface name,ofport,external_ids --format=json] dies due to the error: ovsdb-client: tcp:127.0.0.1:6639: receive failed (End of file)
2019-07-19 02:04:22.077 243655 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6639: send error: Connection refused
2019-07-19 02:04:22.079 243655 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6639: connection dropped (Connection refused)
2019-07-19 02:04:22.089 243737 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6639: send error: Connection refused
2019-07-19 02:04:22.090 243737 WARNING ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6639: connection dropped (Connection refused)
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.ofswitch [req-8c7be001-d68e-4d45-929e-50017d044bb0 - - - - -] ofctl request version=0x4,msg_type=0x12,msg_len=0x38,xid=0xb04757ee,OFPFlowStatsRequest(cookie=0,cookie_mask=0,flags=0,match=OFPMatch(oxm_fields={}),out_group=4294967295,out_port=4294967295,table_id=23,type=1) timed out: Timeout: 10 seconds
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int [req-8c7be001-d68e-4d45-929e-50017d044bb0 - - - - -] Failed to communicate with the switch: RuntimeError: ofctl request version=0x4,msg_type=0x12,msg_len=0x38,xid=0xb04757ee,OFPFlowStatsRequest(cookie=0,cookie_mask=0,flags=0,match=OFPMatch(oxm_fields={}),out_group=4294967295,out_port=4294967295,table_id=23,type=1) timed out
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int Traceback (most recent call last):
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int   File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/br_int.py", line 52, in check_canary_table
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int     flows = self.dump_flows(constants.CANARY_TABLE)
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int   File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/ofswitch.py", line 147, in dump_flows
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int     reply_multi=True)
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int   File "/usr/lib/python2.7/site-packages/neutron/plugins/ml2/drivers/openvswitch/agent/openflow/native/ofswitch.py", line 95, in _send_msg
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int     raise RuntimeError(m)
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int RuntimeError: ofctl request version=0x4,msg_type=0x12,msg_len=0x38,xid=0xb04757ee,OFPFlowStatsRequest(cookie=0,cookie_mask=0,flags=0,match=OFPMatch(oxm_fields={}),out_group=4294967295,out_port=4294967295,table_id=23,type=1) timed out
2019-07-19 02:04:23.171 243655 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.br_int
2019-07-19 02:04:23.188 243655 WARNING neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [req-8c7be001-d68e-4d45-929e-50017d044bb0 - - - - -] OVS is dead. OVSNeutronAgent will keep running and checking OVS status periodically.
2019-07-19 02:04:23.401 186476 INFO eventlet.wsgi.server [req-fb780aa9-92a7-4cee-8195-fa34b5d7b0e0 2798eb7d8ca94c3eb4c134eb47bca7ea cea798d27ac44ca8b871877fd2adfeea default - -] 127.168.204.3,127.168.204.3 "GET /v1/alarms/summary HTTP/1.1" status: 200  len: 306 time: 0.0467389
2019-07-19 02:04:24.190 243655 WARNING neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [req-8c7be001-d68e-4d45-929e-50017d044bb0 - - - - -] OVS is restarted. OVSNeutronAgent will reset bridges and recover ports.
2019-07-19 02:04:24.242 243655 INFO neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [req-8c7be001-d68e-4d45-929e-50017d044bb0 - - - - -] Mapping physical network providernet-a to bridge br-phy0
2019-07-19 02:04:24.295 243655 INFO neutron.plugins.ml2.drivers.openvswitch.agent.openflow.native.ovs_bridge [req-8c7be001-d68e-4d45-929e-50017d044bb0 - - - - -] Bridge br-phy0 has datapath-ID 0000f8f21e640120