Hi Steven. Thanks for sharing your cyber security expertise & knowledge with the community on your YouTube channel. Best wishes & continue your inspiring cyber security training & work.
Network analysis relies on having pcap data available. You'd need to capture pcap data 24/7, right? What tool do you recommend to capture pcap data of that magnitude? If opensource, would it be recommended to deploy in a production environment?
Great insight and question! You’re correct that network analysis relies on having pcap data, but you can actually get away with NetFlow/NGFW log data without having actual PCAPs, and that is how many organizations are set up. To capture pcaps at scale, you would use a network packet aggregator/indexer such as Gigamon or Arkime (the open-source route) with taps set up, but please note that Arkime will require a beefy machine to be used in a production environment.
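For a sense of what 24/7 capture looks like without a dedicated aggregator, a minimal sketch is a rotating tcpdump ring buffer. The interface name and output path here are assumptions, not a recommendation for production:

```shell
# Hypothetical round-the-clock capture sketch (requires root and a real interface).
# -s 0   : capture full packets, not truncated snaplens
# -G 3600: rotate the output file every hour
# -W 24  : keep a 24-file ring buffer, i.e. roughly one day of pcaps
tcpdump -i eth0 -s 0 -G 3600 -W 24 -w '/data/pcap/cap-%Y%m%d-%H%M%S.pcap'
```

At any real traffic volume this fills disks fast, which is exactly why tools like Arkime add indexing and retention management on top of raw capture.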
Something is off with Task 5. Although 56 username/password pairs was accepted as the correct answer, there was one literal "username" string that should have been filtered out, leaving 55: `cat u.txt | cut -d '=' -f 2 | cut -d '%' -f 1 | sort | uniq | grep -v username | wc -l`
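To see why the `grep -v username` step changes the count, here is the same pipeline run on a tiny fabricated sample (the file contents are made up for illustration, not taken from the task):

```shell
# Fabricated sample: two real users plus one stray literal "username" token,
# mimicking URL-encoded form data like username=alice%40...
printf 'username=alice%%40pass\nusername=bob%%41pass\nusername=username%%42\n' > u.txt

# Extract the value after '=', strip the URL-encoded tail after '%',
# de-duplicate, drop the literal "username" artifact, and count.
cut -d '=' -f 2 u.txt | cut -d '%' -f 1 | sort | uniq | grep -v username | wc -l
# Without the grep -v, the stray "username" line inflates the count by one.
```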
Piece of Art 🎨.
My pleasure!
Awesome stuff ❤🎉🎉
Couldn’t you have opened the log file in Splunk to get the data into that format more easily?
You would need to download apps in Splunk to ingest PCAPs, and/or use a tool such as Zeek to parse the PCAP into logs that Splunk can read
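As a rough sketch of that Zeek step (the pcap filename is an assumption):

```shell
# Hypothetical offline run: Zeek reads a capture file and writes structured
# logs (conn.log, http.log, dns.log, ...) into the current directory.
zeek -r capture.pcap
# Those tab-separated (or JSON, if configured) logs can then be ingested
# by Splunk like any other log files.
```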
🤝