With recent advances in networking technology, networks play an ever-larger role in most users' lives. Internet search and retrieval systems are powerful tools for sharing information and perspectives across the world. However, centralized search engines are vulnerable to censorship, since all of their information is controlled by the sites' administrators, and governments and organizations routinely restrict and control information, censoring or intruding on even the freest and most open communication media. For this reason, Peer-to-Peer (P2P) search and retrieval systems are designed to resist censorship over the network. Nevertheless, their decentralized nature makes it difficult to infer quantities that cannot be measured directly, such as the proportions of subverted and selfish nodes, and the problem grows harder as the network becomes very large. I therefore propose a dynamic adaptive algorithm that can: 1) tackle censorship and security issues; 2) estimate the proportions of subverted and selfish nodes; 3) defend against malicious and selective-forwarding attacks by appropriately adjusting the number of requests to ensure a high match probability; and 4) remain robust and scalable across different random networks and varied network sizes. In several experiments, I demonstrate that the algorithm effectively and accurately estimates these metrics and manages the system, even when the network contains a large proportion of malicious nodes, a large proportion of selfish nodes, or only a partial view of network membership.
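To illustrate the kind of request-count adjustment described above, the following is a minimal sketch, not the paper's actual algorithm: assuming each request independently reaches an honest node with probability `p_honest`, the chance that at least one of `n` requests succeeds is 1 - (1 - p_honest)^n, so the smallest `n` meeting a target match probability can be solved in closed form. The names `requests_needed`, `p_honest`, and `target` are illustrative, not from the source.

```python
import math

def requests_needed(p_honest: float, target: float) -> int:
    """Smallest n such that 1 - (1 - p_honest)**n >= target.

    p_honest: assumed probability a single request reaches an honest,
              cooperative node (i.e., is not dropped by a subverted or
              selfish node).
    target:   desired overall match probability.
    """
    if not (0.0 < p_honest <= 1.0) or not (0.0 <= target < 1.0):
        raise ValueError("require 0 < p_honest <= 1 and 0 <= target < 1")
    if p_honest == 1.0:
        return 1  # every request succeeds; one suffices
    # Solve (1 - p_honest)**n <= 1 - target for the smallest integer n.
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_honest))

# Example: if half the nodes misbehave, 7 requests give >= 99% match probability.
print(requests_needed(0.5, 0.99))  # → 7
```

As the estimated proportion of misbehaving nodes rises, `p_honest` falls and the required number of requests grows, which is why an accurate estimate of that proportion matters for keeping overhead low.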
- Distributed systems
- Message forwarding
- Peer-to-Peer (P2P) search and retrieval
- Probabilistic analysis
- Probability Density Function (pdf)
- Random networks