
User-attribute-aware edge-caching mechanism for mobile social networks

  • Abstract: To address network congestion and the degraded quality of user experience caused by the explosive growth of data traffic, a user-attribute-aware edge-caching mechanism is proposed. First, a latent factor model is used to learn users' degree of interest in each content category and thereby estimate the locally popular content; small base stations then cooperatively cache the predicted locally popular content and update it in real time as user preferences change. To further reduce transmission delay, communities of interest are constructed according to user preferences; within each community, suitable caching users are selected, based on their caching willingness and caching capability, to cache the target content and share it with ordinary users. Results show that the proposed mechanism outperforms the random-caching and most-popular-content caching algorithms, improving the cache hit rate and reducing transmission delay while enhancing the quality of user experience.

     

    Abstract: With the rapid growth in the number of intelligent terminal devices and wireless multimedia applications, mobile communication traffic has exploded. The Cisco Visual Networking Index (VNI) report indicates that by 2022, global mobile data traffic will have grown to three times its 2017 volume, which will exert tremendous pressure on the backhaul link. One key approach to this problem is to cache popular content at the network edge (base stations and mobile devices) and serve requests locally, instead of fetching the requested content from the content server over the backhaul network. By delivering the required content to mobile users locally, edge caching can effectively improve network performance and relieve the pressure on the backhaul link. However, owing to the limited storage capacity of edge nodes and the diversity of user requirements, edge nodes can neither cache all the content held by the content server nor cache content at random. To solve these problems, a user-aware edge-caching mechanism is proposed. First, an implicit semantic (latent factor) model predicts the popular content in a macro cell from the users' interests. Small base stations within the same macro cell then cache this content cooperatively, updating the locally popular content as the users' content preferences change. To further reduce the content-delivery delay, users are grouped into communities of interest according to their content preferences; within each community, the most appropriate user equipment (UE) is selected, considering its caching willingness and caching ability, to cache data for the other UEs. Results show that the proposed mechanism outperforms the random-caching approach and the most-popular-content caching algorithm: it improves the cache hit rate and reduces the transmission delay while enhancing the quality of user experience.
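    The pipeline the abstract describes can be sketched in three steps: predict per-user interest with a latent factor model, rank content by aggregate local interest to fill the small-base-station cache, and pick a caching UE inside a community of interest. The sketch below is an illustrative assumption, not the authors' exact formulation: the factorization hyperparameters and the willingness-times-ability selection score are hypothetical stand-ins for the paper's actual models.

    ```python
    import numpy as np

    def predict_interest(R, k=2, steps=200, lr=0.01, reg=0.02, seed=0):
        """Latent factor model: factor a sparse user-content rating matrix R
        (0 = unobserved) into user factors P and content factors Q by SGD,
        so that P @ Q.T estimates every user's interest in every item."""
        rng = np.random.default_rng(seed)
        n_users, n_items = R.shape
        P = rng.normal(scale=0.1, size=(n_users, k))
        Q = rng.normal(scale=0.1, size=(n_items, k))
        users, items = np.nonzero(R)  # train only on observed entries
        for _ in range(steps):
            for u, i in zip(users, items):
                err = R[u, i] - P[u] @ Q[i]
                P[u] += lr * (err * Q[i] - reg * P[u])
                Q[i] += lr * (err * P[u] - reg * Q[i])
        return P @ Q.T

    def locally_popular(interest, cache_size):
        """Rank content by predicted interest summed over local users and
        return the indices of the top `cache_size` items to cache."""
        popularity = interest.sum(axis=0)
        return np.argsort(popularity)[::-1][:cache_size]

    def pick_caching_user(willingness, ability):
        """Select the UE in a community of interest with the largest
        willingness * ability product (an assumed scoring rule)."""
        return int(np.argmax(np.asarray(willingness) * np.asarray(ability)))

    # Toy usage: 3 local users, 4 content items, partially observed ratings.
    R = np.array([[5., 0., 3., 0.],
                  [4., 0., 0., 1.],
                  [0., 2., 4., 0.]])
    interest = predict_interest(R)
    cached = locally_popular(interest, cache_size=2)
    helper = pick_caching_user([0.9, 0.5, 0.7], [0.2, 0.8, 0.3])
    ```

    A real deployment would replace the dense SGD loop with an incremental update as preferences drift, which is what lets the small base stations refresh the cached set in real time.
    
    
    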

     

