Abstract: Intelligent wide-area surveillance requires efficient distributed cooperative inference among agents. In this paper, we investigate a cache management model that supports better context inference over a distributed data infrastructure. Context-aware computing with ontology-based inference is widely used in distributed surveillance environments. However, even smart devices generally have limited memory and power, so each device can manage only part of the surveillance data, and higher-level inference that considers information from a related wide area can cause heavy data transmission. Surveillance data integration merges and aligns ontologies according to their semantic similarity to enable efficient cooperation. For such a collaborative network, we adopt ICN (Information-Centric Networking) and design a cache management scheme that facilitates efficient ontology integration. The proposed cache management model adapts to actual device demands, and our scheme demonstrates improved efficiency, resulting in better context inference.