A forensic methodology is presented for examining very large log files to extract information that can help digital investigators and examiners investigate cloud-based crimes that occurred during a specific period. In this methodology, Apache Hadoop and Apache Spark are used to analyze web log data: Apache Hadoop is used for batch analysis of log data, while Apache Spark is used to provide both batch and real-time analysis of 200 web server logs. In each approach, three different programs are implemented and tested on three log files of different sizes. Each program extracts a different kind of information that can help digital investigators reconstruct timelines related to the crimes that occurred. The results show that Apache Hadoop and Apache Spark can serve as fast platforms for processing large log files of various sizes and extracting useful information, supporting digital investigators in analyzing massive amounts of cloud log data within a given time frame as well as reconstructing timelines related to incidents. Furthermore, the results can be used to reconstruct and generate a timeline of the historical sequence of events that occurred during a crime, and to identify the malicious user's IP address, the date and time of access, and the number of accesses.
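As an illustration of the kind of extraction the abstract describes, the following is a minimal sketch (not the paper's implementation, and independent of Hadoop or Spark) that parses web server access-log lines in the Apache Common Log Format to recover each client IP address, the date and time of each request, and the number of accesses per IP; the sample log lines and function names are hypothetical.

```python
import re
from collections import Counter

# Regex for the Apache Common Log Format:
# IP, identity, user, [timestamp], "request", status, size
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<timestamp>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+'
)

def extract_accesses(log_lines):
    """Return per-IP access counts and a list of (ip, timestamp) events."""
    counts = Counter()
    events = []
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m:
            counts[m.group("ip")] += 1
            events.append((m.group("ip"), m.group("timestamp")))
    return counts, events

# Hypothetical sample log lines for demonstration only.
sample = [
    '10.0.0.5 - - [12/Mar/2021:10:15:32 +0000] "GET /index.html HTTP/1.1" 200 1024',
    '10.0.0.5 - - [12/Mar/2021:10:15:40 +0000] "GET /admin HTTP/1.1" 403 512',
    '192.168.1.9 - - [12/Mar/2021:10:16:01 +0000] "POST /login HTTP/1.1" 200 256',
]
counts, events = extract_accesses(sample)
print(counts["10.0.0.5"])  # → 2 accesses from this IP
```

In the methodology described above, the same per-IP counting and timestamping would run at scale as Hadoop MapReduce jobs or Spark transformations rather than in a single Python process; the event list, sorted by timestamp, is what supports timeline reconstruction.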
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.