Domain Knowledge Integration and Distillation Using Text Mining

Authors

  • Shashi Pal Singh, Ritu Tiwari, Sanjeev Sharma, Aastha Porwa, Shrishti Choudhary

Keywords:

Integration, Distillation, Data Preprocessing, NLP, Confusion Matrix, Classification.

Abstract

Imagine a media organization striving to combat the spread of misinformation during an election season. The ability to synthesize and simplify diverse knowledge sources becomes crucial for accurately predicting and identifying fake news. In today's dynamic world, this need extends across all disciplines. This paper explores two fundamental processes essential for achieving such synthesis: domain knowledge integration, which harmonizes information from various sources, and distillation, which extracts essential insights. Through a comprehensive literature review and detailed case studies, we examine the methodologies, challenges, and benefits associated with these processes. Embracing domain knowledge integration and distillation enables organizations to streamline operations, enhance strategic planning, and gain a competitive edge. The results reveal significant improvements in fake news detection accuracy, streamlined operations, and increased stakeholder confidence.

Moreover, the application of these processes in real-world scenarios underscores their practical value. By leveraging these methodologies, media organizations can better allocate resources, respond more quickly to emerging threats, and maintain public trust. This research thus provides a robust framework for other sectors facing similar challenges, highlighting the broad applicability of these techniques. The insights gained from this study are poised to influence future strategies for combating misinformation, promoting a more informed and resilient society. This research contributes to the understanding of these methodologies while maintaining originality and integrity in its findings.
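The abstract reports gains in fake news detection accuracy and lists a confusion matrix and classification among its keywords, but it does not describe the implementation. The sketch below is therefore only a minimal, hypothetical illustration of how such a classifier might be evaluated with a confusion matrix, assuming a simple TF-IDF plus logistic regression pipeline; the texts, labels, and model choice are placeholders and are not taken from the paper.

```python
# Illustrative sketch only: a minimal text-classification pipeline evaluated
# with a confusion matrix. The corpus, labels, and model are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

# Toy corpus: 1 = fake, 0 = genuine (placeholder examples).
texts = [
    "Candidate X secretly funded by aliens, insiders claim",
    "Election commission publishes certified vote totals",
    "Miracle cure suppressed by the government, share now",
    "Parliament passes budget after three days of debate",
    "Shocking leaked memo proves the polls are rigged",
    "Weather service issues heat warning for the weekend",
]
labels = [1, 0, 1, 0, 1, 0]

X_train, X_test, y_train, y_test = train_test_split(
    texts, labels, test_size=0.33, random_state=42, stratify=labels
)

# Integrate lexical evidence (TF-IDF features) and distil it into a
# linear decision rule (logistic regression).
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(X_train, y_train)

y_pred = model.predict(X_test)
print("Accuracy:", accuracy_score(y_test, y_pred))
print("Confusion matrix:\n", confusion_matrix(y_test, y_pred, labels=[0, 1]))
```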




Published

06.08.2024

How to Cite

Shashi Pal Singh. (2024). Domain Knowledge Integration and Distillation Using Text Mining. International Journal of Intelligent Systems and Applications in Engineering, 12(23s), 367–. Retrieved from https://ijisae.org/index.php/IJISAE/article/view/6848

Issue

Section

Research Article