
Infosec Tools

A list of information security tools I use for assessments, investigations and other cybersecurity tasks.

Also worth checking out is CISA’s list of free cybersecurity services and tools.

Jump to Section


OSINT / Reconnaissance

Network Tools (IP, DNS, WHOIS)

Breaches, Incidents & Leaks

FININT (Financial Intelligence)

  • GSA eLibrary - Source for the latest GSA contract award information

GEOINT (Geographical Intelligence)

HUMINT (Human & Corporate Intelligence)

  • No-Nonsense Intel - List of keywords you can use to screen for adverse media, military links, political connections, sources of wealth, asset tracing, etc.
  • CheckUser - Check desired usernames across social network sites
  • CorporationWiki - Find and explore relationships between people and companies
  • Crunchbase - Discover innovative companies and the people behind them
  • Find Email - Find email addresses from any company
  • Info Sniper - Search property owners, deeds & more
  • Library of Leaks - Search documents, companies and people
  • LittleSis - Who-knows-who at the heights of business and government
  • NAMINT - Shows possible name and login search patterns
  • OpenCorporates - Legal-entity database
  • That’s Them - Find addresses, phones, emails and much more
  • TruePeopleSearch - People search service
  • WhatsMyName - Enumerate usernames across many websites
  • Whitepages - Find people, contact info & background checks

IMINT (Imagery/Maps Intelligence)

MASINT (Measurement and Signature Intelligence)

SOCMINT (Social Media Intelligence)

Email

Code Search

  • grep.app - Search across a half million git repos
  • PublicWWW - Find any alphanumeric snippet, signature or keyword in a web page’s HTML, JS and CSS code
  • searchcode - Search 75 billion lines of code from 40 million projects

Scanning / Enumeration / Attack Surface


Offensive Security

Exploits

  • Bug Bounty Hunting Search Engine - Search for writeups, payloads, bug bounty tips, and more…
  • BugBounty.zip - Your all-in-one solution for domain operations
  • CP-R Evasion Techniques
  • CVExploits - Comprehensive database for CVE exploits
  • DROPS - Dynamic CheatSheet/Command Generator
  • Exploit Notes - Hacking techniques and tools for penetration testing, bug bounty, CTFs
  • ExploitDB - Huge repository of exploits from Offensive Security
  • files.ninja - Upload any file and find similar files
  • Google Hacking Database (GHDB) - A list of Google search queries used in the OSINT phase of penetration testing
  • GTFOArgs - Curated list of Unix binaries that can be manipulated for argument injection
  • GTFOBins - Curated list of Unix binaries that can be used to bypass local security restrictions in misconfigured systems
  • Hijack Libs - Curated list of DLL Hijacking candidates
  • Living Off the Living Off the Land - A great collection of resources to thrive off the land
  • Living Off the Pipeline - CI/CD lolbin
  • Living Off Trusted Sites (LOTS) Project - Repository of popular, legitimate domains that can be used to conduct phishing, C2, exfiltration & tool downloading while evading detection
  • LOFLCAB - Living off the Foreign Land Cmdlets and Binaries
  • LoFP - Living off the False Positive
  • LOLBAS - Curated list of Windows binaries that can be used to bypass local security restrictions in misconfigured systems
  • LOLC2 - Collection of C2 frameworks that leverage legitimate services to evade detection
  • LOLESXi - Living Off The Land ESXi
  • LOLOL - A great collection of resources to thrive off the land
  • LOLRMM - Remote Monitoring and Management (RMM) tools that could potentially be abused by threat actors
  • LOOBins - Living Off the Orchard: macOS Binaries (LOOBins) is designed to provide detailed information on various built-in macOS binaries and how they can be used by threat actors for malicious purposes
  • LOTTunnels - Living Off The Tunnels
  • Microsoft Patch Tuesday Countdown
  • offsec.tools - A vast collection of security tools
  • Shodan Exploits
  • SPLOITUS - Exploit search database
  • VulnCheck XDB - An index of exploit proof of concept code in git repositories
  • XSSed - Information on and an archive of Cross-Site-Scripting (XSS) attacks

Red Team

  • ArgFuscator - Generates obfuscated command lines for common system tools
  • ARTToolkit - Interactive cheat sheet, containing a useful list of offensive security tools and their respective commands/payloads, to be used in red teaming exercises
  • Atomic Red Team - A library of simple, focused tests mapped to the MITRE ATT&CK matrix
  • C2 Matrix - Select the best C2 framework for your needs based on your adversary emulation plan and the target environment
  • ExpiredDomains.net - Expired domain name search engine
  • Living Off The Land Drivers - Curated list of Windows drivers used by adversaries to bypass security controls and carry out attacks
  • Unprotect Project - Search Evasion Techniques
  • WADComs - Curated list of offensive security tools and their respective commands, to be used against Windows/AD environments

Web Security

  • Invisible JavaScript - Execute invisible JavaScript by abusing Hangul filler characters
  • INVISIBLE.js - A super compact (116-byte) bootstrap that hides JavaScript using a Proxy trap to run code

Security Advisories

  • CISA Alerts - Providing information on current security issues, vulnerabilities and exploits
  • ICS Advisory Project - DHS CISA ICS Advisories data visualized as a Dashboard and in Comma Separated Value (CSV) format to support vulnerability analysis for the OT/ICS community

Attack Libraries

A more comprehensive list of Attack Libraries can be found here.

  • ATLAS - Adversarial Threat Landscape for Artificial-Intelligence Systems is a knowledge base of adversary tactics and techniques based on real-world attack observations and realistic demonstrations from AI red teams and security groups
  • ATT&CK
  • Risk Explorer for Software Supply Chains - A taxonomy of known attacks and techniques to inject malicious code into open-source software projects.

Vulnerability Catalogs & Tools

Risk Assessment Models

A more comprehensive list of Risk Assessment Models and tools can be found here.


Blue Team

CTI & IoCs

  • Alien Vault OTX - Open threat intelligence community
  • BAD GUIDs EXPLORER
  • Binary Edge - Real-time threat intelligence streams
  • Cloud Threat Landscape - A comprehensive threat intelligence database of cloud security incidents, actors, tools and techniques. Powered by Wiz Research
  • CTI AI Toolbox - AI-assisted CTI tooling
  • CTI.fyi - Content shamelessly scraped from ransomwatch
  • CyberOwl - Stay informed on the latest cyber threats
  • Dangerous Domains - Curated list of malicious domains
  • HudsonRock Threat Intelligence Tools - Cybercrime intelligence tools
  • InQuest Labs - Indicator Lookup
  • IOCParser - Extract Indicators of Compromise (IOCs) from different data sources
  • Malpuse - Scan, Track, Secure: Proactive C&C Infrastructure Monitoring Across the Web
  • ORKL - Library of collective past achievements in the realm of CTI reporting.
  • Pivot Atlas - Educational pivoting handbook for cyber threat intelligence analysts
  • Pulsedive - Threat intelligence
  • ThreatBook TI - Search for IP address, domain
  • threatfeeds.io - Free and open-source threat intelligence feeds
  • ThreatMiner - Data mining for threat intelligence
  • TrailDiscover - Repository of CloudTrail events with detailed descriptions, MITRE ATT&CK insights, real-world incidents references, other research references and security implications
  • URLAbuse - Open URL abuse blacklist feed
  • urlquery.net - Free URL scanner that performs analysis for web-based malware

URL Analysis

Static / File Analysis

  • badfiles - Enumerate bad, malicious, or potentially dangerous file extensions
  • CyberChef - The cyber swiss army knife
  • DocGuard - Static file scanner that brings a structural-analysis perspective to static analysis
  • dogbolt.org - Decompiler Explorer
  • EchoTrail - Threat hunting resource used to search for a Windows filename or hash
  • filescan.io - File and URL scanning to identify IOCs
  • filesec.io - Latest file extensions being used by attackers
  • Kaspersky TIP
  • Manalyzer - Static analysis on PE executables to detect undesirable behavior
  • PolySwarm - Scan Files or URLs for threats
  • VirusTotal - Analyze suspicious files and URLs to detect malware

Dynamic / Malware Analysis

Forensics

  • DFIQ - Digital Forensics Investigative Questions and the approaches to answering them

Phishing / Email Security


Assembly / Reverse Engineering


OS / Scripting / Programming

Regex


Password


AI

  • OWASP AI Exchange - Comprehensive guidance and alignment on how to protect AI against security threats

Assorted

OpSec / Privacy

Jobs

  • infosec-jobs - Find awesome jobs and talents in InfoSec / Cybersecurity

Conferences / Meetups

Infosec / Cybersecurity Research & Blogs

Funny

Walls of Shame

  • Audit Logs Wall of Shame - A list of vendors that don’t prioritize high-quality, widely-available audit logs for security and operations teams
  • Dumb Password Rules - A compilation of sites with dumb password rules
  • The SSO Wall of Shame - A list of vendors that treat single sign-on as a luxury feature, not a core security requirement
  • ssotax.org - A list of vendors that lock SSO in a subscription tier priced more than 10% above the standard price
  • Why No IPv6? - Wall of shame for IPv6 support

Other

Dynamization of Jekyll

Jekyll is a framework for creating websites/blogs using static plain-text files. Jekyll is used by GitHub Pages, which is also the current hosting provider for Shellsharks.com. I’ve been using GitHub Pages since the inception of my site and for the most part have no complaints. With that said, a purely static site has some limitations in terms of the types of content one can publish/expose.

I recently got the idea to create a dashboard-like page which could display interesting quantitative data points (and other information) related to the site. Examples of these statistics include the total number of posts, the age of my site, when my blog was last updated, overall word count across all posts, etc. Out of the box, Jekyll is limited in its ability to generate this information in a dynamic fashion. The Jekyll-infused GitHub Pages engine generates the site via an inherent pages-build-deployment GitHub Action (more on this later) upon commit. The site then stays static until the next build. As such, it has limited native ability to update content in-between builds/manual commits.

To solve this, I’ve started using a variety of techniques/technologies (listed below) to introduce more dynamic functionality to my site (and more specifically, the aforementioned statboard).

Jekyll Liquid

Though not truly “dynamic”, the Liquid* templating language is an easy, Jekyll-native way to generate static content in a quasi-dynamic way at site build time. As an example, if I wanted to denote the exact date and time that a blog post was published, I might first try the Liquid template {{ site.time }}. What this actually gives me is a timestamp for when the site was built (e.g. 2025-05-04 14:04:30 -0400), rather than the last-updated date of the post itself. So instead, I can harness the post’s custom front matter, such as “updated:”, and access that value using the tag {{ page.updated }} (so we get, __).
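A minimal sketch of how this looks in a post’s source file, assuming a custom “updated:” front matter field as described above (the title and date here are placeholders, not real post data):

```liquid
---
title: "Example Post"
updated: 2025-05-04
---
Site built at: {{ site.time }}      <!-- build timestamp, not the post's date -->
Last updated: {{ page.updated }}    <!-- value pulled from this post's front matter -->
```

Liquid resolves both tags at build time, which is why {{ site.time }} reflects the build rather than the post.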

One component on the (existing) Shellsharks statboard calculates the age of the site using the last updated date of the site (maintained in the change log), minus the publish date of the first-ever Shellsharks post. Since a static, Jekyll-based, GitHub Pages site is only built (and thus only updated) when I actually physically commit an update, this component will be out of date if I do not commit at least daily. So how did I solve this? Enter GitHub Actions.
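For reference, a site-age component like this can be computed entirely in Liquid at build time. A sketch, assuming a hypothetical launch date (the real first-post date would go in its place); the `date: '%s'` filter converts a date to Unix epoch seconds:

```liquid
{% assign launch = '2019-01-01' | date: '%s' %}
{% assign built  = site.time | date: '%s' %}
{% assign age_days = built | minus: launch | divided_by: 86400 %}
This site is {{ age_days }} days old.
```

The catch, as noted above, is that this number is only as fresh as the most recent build.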

* Learn more about the tags, filters and other capabilities of Liquid here.

JavaScript & jQuery

Before we dive into the power of GitHub Actions, it’s worth mentioning the ability to add dynamism by simply dropping straight up, in-line JavaScript directly into the page/post Markdown (.md) files. Remember, Jekyll produces .html files directly from static, text-based files (like Markdown), so the inclusion of raw JS syntax will translate into embedded, executable JS code in the final, generated HTML files. The usual rules for in-page JS apply here.

One component idea I had for the statboard was a counter of named vulnerabilities. So how could I grab that value from the page? At first, I tried fetching the DOM element whose id exposed the count. However, this failed because fetching that element alone meant not fetching the JS and other HTML content that was used to actually generate that count. To solve this, I used jQuery to load the entire page into a temporary <div> tag, iterated through the list (<li>) elements within that div (similar to how I calculate it on the origin page), and finally set the dashboard component to the calculated count!

// Load the /infosec-blogs page into a detached <div>, count its <li> elements,
// then write that count into the #iblogs dashboard component.
$('<div></div>').load('/infosec-blogs', function () {
  var blogs = $("li", this).length;
  $("#iblogs").html(blogs);
});
Additional notes on the use of JS and jQuery
  • I used Google’s Hosted Libraries to reference jQuery <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.6.0/jquery.min.js"></script>.
  • Be wary of adding JS comments // in Markdown files, as I noticed the Jekyll parsing engine doesn’t do a great job of new-lining, and thus everything after a comment will end up being commented.
  • When using Liquid tags in in-line JS, ensure quotes ('', "") are added around the templates so that the JS code will recognize those values as strings (where applicable).
  • The ability to add raw, arbitrary JS means there is a lot of untapped capability to add dynamic content to an otherwise static page. Keep in mind though that JS code is client-side, so you are still limited in that typical server-side functionality is not available in this context.
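To illustrate the quoting point above (the date value here is a stand-in for whatever Liquid actually emits):

```javascript
// In the Markdown source, Liquid runs first at build time:
//   var updated = "{{ page.updated }}";
// Without the quotes, the generated JS would be `var updated = 2025-05-04;`,
// which JavaScript evaluates as subtraction (2025 - 5 - 4), not a date.
var updated = "2025-05-04"; // what the generated HTML contains, quotes intact
console.log(typeof updated); // "string"
```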

GitHub Actions

Thanks to the scenario I detailed in the Jekyll Liquid section, I was introduced to the world of GitHub Actions. Essentially, I needed a way to force an update/regeneration of my site so that one of my statically generated Liquid tags would update at some minimum frequency (in this case, at least daily). After some Googling, I came across this action, which allowed me to do just that! It forces a blank build using a user-defined schedule as the trigger.

# File: .github/workflows/refresh.yml
name: Refresh

on:
  schedule:
    - cron:  '0 3 * * *' # Runs every day at 3am

jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - name: Trigger GitHub pages rebuild
        run: |
          curl --fail --request POST \
            --url https://api.github.com/repos/${{ github.repository }}/pages/builds \
            --header "Authorization: Bearer $USER_TOKEN"
        env:
          # You must create a personal token with repo access as GitHub does
          # not yet support server-to-server page builds.
          USER_TOKEN: ${{ secrets.USER_TOKEN }}

In order to get this Action going, follow these steps…

  1. Log into your GitHub account and go to Settings (in the top right) -> Developer settings -> Personal access tokens.
  2. Generate a new token and give it full repo access scope (More on OAuth scopes). I set mine to never expire, but you can choose what works best for you.
  3. Navigate to your GitHub Pages site repo, ***.github.io -> Settings -> Secrets -> Actions section. Here you can add a New repository secret, where you give it a unique name and set the value to the personal access token generated earlier.
  4. In the root of your local site repository, create a .github/workflows/ folder (if one doesn’t already exist).
  5. Create a <name of your choice>.yml file containing the actual Action code (like what was provided above).
  6. Commit this Action file and you should be able to see run details in your repo -> Actions section within GitHub.
Additional Considerations for GitHub Actions
  • When using the Liquid tag {{ site.time }} with a GitHub Action-triggered build, understand that it will use the time of the server generating the HTML, in this case the GitHub servers themselves, which means the date will be in UTC (Conversion help).
  • Check out this reference for information on how to specify the time zone in the front matter of a page or within the Jekyll config file.
  • GitHub Actions are awesome and powerful, but there are limitations to be aware of. Notably, it is important to understand the billing considerations. Free tier accounts get 2,000 minutes/month while Pro tier accounts (priced at about $44/user/year) get 3,000.
  • For reference, the refresh action (provided above) was running (for me) at about 13 seconds per trigger. This means you could run that action over 9,000 times without exceeding the minute cap for a Free-tier account.
  • With the above said, also consider that the default pages-build-deployment Action used by GitHub Pages to actually generate and deploy your site upon commit will also consume those allocated minutes. Looking at my Actions pane, I am seeing roughly 1-minute run times for each build-and-deploy action trigger.
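As a sanity check on those numbers (using the 13-second run time and 2,000-minute free tier figures from above):

```javascript
// How many 13-second refresh runs fit in the free 2,000 minutes/month?
const secondsPerRun = 13;
const freeTierMinutes = 2000;
const maxRuns = Math.floor((freeTierMinutes * 60) / secondsPerRun);
console.log(maxRuns); // 9230 — i.e. "over 9,000" runs per month
```

In practice the ~1-minute pages-build-deployment runs will eat into that budget far faster than the refresh trigger itself.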

What’s Next

I’ve only just started to scratch the surface of how I can further extend and dynamize my Jekyll-based site. In future updates to this guide (or in future posts), I plan to cover more advanced GitHub Action capabilities as well as how else to add server-side functionality (maybe through serverless!) to the site. Stay tuned!

Building a retro video library with Jellyfin

By: basic

I started building my Jellyfin server with content from my own physical collection. I had been using Channels, the TV-watching software, but didn't like how limited it was for editing metadata sources and images, or the overall experience. Though it was nice to keep all my watching in one app, Channels only does video, and I wanted to add music, podcasts and audiobooks to my collection. We also canceled many streaming service subscriptions and decided to keep only a couple, going month-to-month on others as we found series to catch up on.

Once everything appeared to be working, I started thinking about adding retro media such as Saturday Morning Cartoons and anime from the 70s & 80s. I first came up with the idea after watching clips on YouTube and Archive.org. I was able to find older episodes of anime shows I used to watch as a kid before walking to school: Robotech and Star Blazers. I never realized how old the series Star Blazers was until I read it was from 1974! Guess it was already syndicated in the 80s when I started watching.

Issues

I ran into trouble with scanning new media in Jellyfin. I thought I had the folder structure correct, but it seems Jellyfin doesn't like too many sub-folders. After an hour of pulling my hair out reading forums and docs, I figured this out on my own: I pulled the anime out into its own shows-only folder, since movies were OK. I recently added Akira (1988) and it was detected easily within Jellyfin. I now have separate libraries for Anime Movies and Anime Shows, each scanning a separate folder, making it easier for me to manage.
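The split described above ends up looking something like this (a sketch following Jellyfin's recommended shows/movies naming; the titles are the ones mentioned in this post):

```
Media/
├── Anime Shows/
│   └── Star Blazers (1974)/
│       └── Season 01/
│           └── Star Blazers S01E01.mkv
└── Anime Movies/
    └── Akira (1988)/
        └── Akira (1988).mkv
```

Pointing each Jellyfin library at exactly one of these top-level folders keeps the scanner from tripping over mixed content.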

Plugins

Jellyfin has a nice library of plugins. I added AnimDB for extra data in case TMDB didn't include all the information required for these older series. TMDB, which is built into the default Jellyfin install, has been very good so far.

Podcasts

Back in the early days of podcasts, around 2005, there were vidcasts, which were basically small video shows: Diggnation, Ask A Ninja, Tikibar TV and Totally Rad Show, to name a few. I have found some on Archive.org, which made me happy. Back then I had to watch on a computer, but today we have various options to watch from our phones, tablets and TVs, using Jellyfin for example.

Next I will find Saturday Morning Cartoons and create a new section in Jellyfin. The clips I've seen even include the original TV commercials we saw as kids. I plan to have this playing in the background while I work on my remote days at home.
