
Share your search! (or a bug ;) )

· One min read

To simplify sharing information, you may now click on the Share icon at the top right of the GUI.

You may then share this link with anyone who has access to this Whisperer. When the link is opened in a browser, the GUI opens and displays exactly (or almost exactly) what you are currently seeing. Redux's magic ;-)
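As a rough sketch of how such a share link can work (the URL shape and field names below are assumptions, not Spider's actual encoding): the GUI state is serialized into a URL-safe token when the link is created, and decoded again when the link is opened.

```python
import base64
import json

def encode_state(state: dict) -> str:
    """Serialize a GUI state dict into a URL-safe token."""
    raw = json.dumps(state, sort_keys=True).encode("utf-8")
    return base64.urlsafe_b64encode(raw).decode("ascii")

def decode_state(token: str) -> dict:
    """Restore the GUI state dict from a shared token."""
    raw = base64.urlsafe_b64decode(token.encode("ascii"))
    return json.loads(raw)

# Hypothetical link shape: the state travels in the URL fragment
state = {"filter": "status:500", "timeRange": {"from": "-15m", "to": "now"}}
link = "https://spider.example/gui#state=" + encode_state(state)
assert decode_state(link.split("#state=", 1)[1]) == state
```

Anyone opening the link gets the same filters and time range restored, without the server having to store the shared view.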

Sticky detail panel

· One min read

A brand new shiny feature from this weekend: the sticky detail panel.

I noticed that it was difficult to use the Host stats feature while changing the time range to compare stats: the selection in the timeline was located behind the detail panel.

So I added a new feature: the detail panel can be made 'sticky', taking its place to the right of the map and table instead of over them. Just click the Pin icon among the detail panel icons to do it.

Becomes:

Enjoy :-)

Whisperer creation and installation

· One min read

A new feature was added last week to allow easy Whisperer creation.

In the Whisperer list, you may click on 'New Whisperer'. This:

  • Creates a new Whisperer with a default name
  • Associates it with your account
  • Creates a default configuration as an 'UPLOAD' Whisperer
  • Offers to edit the name of the Whisperer in the details panel

You may then edit the Whisperer configuration in Capture and Parsing tabs.

For an INTERFACE Whisperer, the installation instructions are provided in the Installation tab:

This tab also allows you to generate (and change) the Whisperer's API key, which it uses to connect to Spider. If the key is lost, it must be generated again, as the server does not keep a copy of it.
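The fact that a lost key must be regenerated suggests the server stores only a digest of the key, never the key itself. A minimal sketch of that pattern (the hash choice and function names are assumptions, not Spider's actual code):

```python
import hashlib
import secrets

def issue_api_key() -> tuple:
    """Generate an API key; the server keeps only its hash."""
    key = secrets.token_urlsafe(32)                    # shown once to the user
    digest = hashlib.sha256(key.encode()).hexdigest()  # stored server-side
    return key, digest

def check_api_key(key: str, stored_digest: str) -> bool:
    """Verify a presented key against the stored digest."""
    return hashlib.sha256(key.encode()).hexdigest() == stored_digest

key, digest = issue_api_key()
assert check_api_key(key, digest)
assert not check_api_key(key + "x", digest)
```

With this design, regenerating the key simply replaces the stored digest; there is nothing to recover if the plaintext is lost.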

For now, as we are in private beta, you need a Gitlab account and authorization to download the Docker image in order to install a Whisperer.

Whisperer remote control, configuration update and monitoring

· 3 min read

Whisperers have seen huge improvements over the past weeks:

  • They can be remotely started and stopped from the services or the GUI
  • They monitor their configuration changes, and their configuration can be updated from the GUI
  • They send the available interfaces of their hosts, to help configuration
  • They monitor their process and allow health checking and monitoring on the server side
  • First monitoring features have been included in the GUI

Whisperers now have 3 distinct modes:

  • INTERFACE: remote Whisperers, installed on a host, that capture network traffic in real time
  • UPLOAD: Whisperers dedicated to pcap uploading in the Spider GUI
  • FILE: 'test' Whisperers that parse a pcap file on a host and send the file to Spider

Remote control

The GUI displays the status of an 'INTERFACE' whisperer in the top left corner


In this order:

  1. Not visible: Whisperer is not started, or not communicating
  2. Connecting: Whisperer is starting
  3. Stopped: Whisperer is started, available, but not capturing
  4. Capturing: Whisperer is capturing data
  5. Wrong configuration: Whisperer configuration is not correct
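A sketch of how the GUI could derive this indicator from what the Whisperer reports (the exact flags and their precedence are assumptions; Spider's actual logic may differ):

```python
from enum import Enum

class WhispererStatus(Enum):
    NOT_VISIBLE = "not started or not communicating"
    CONNECTING = "starting"
    STOPPED = "available, not capturing"
    CAPTURING = "capturing data"
    WRONG_CONFIG = "configuration is not correct"

def display_status(heartbeat_seen: bool, connected: bool,
                   config_valid: bool, capturing: bool) -> WhispererStatus:
    """Map Whisperer-reported flags to the GUI status indicator."""
    if not heartbeat_seen:
        return WhispererStatus.NOT_VISIBLE   # nothing received from it
    if not connected:
        return WhispererStatus.CONNECTING    # seen, but still starting
    if not config_valid:
        return WhispererStatus.WRONG_CONFIG  # connected, bad configuration
    return WhispererStatus.CAPTURING if capturing else WhispererStatus.STOPPED

assert display_status(True, True, True, False) is WhispererStatus.STOPPED
assert display_status(False, False, True, False) is WhispererStatus.NOT_VISIBLE
```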

Whisperer status is controlled from the Whisperer detail view, with the START/STOP CAPTURE button.

Remote configuration update

Whisperer configurations can now be changed on the GUI.

The Capture Config tab sets the configuration of the Whisperer's sniffing agent, and the Parsing Config tab sets the configuration of the parsing done on the server.


  • Help is available by clicking on the small (i) icon
  • The valid network interfaces on the Whisperer side are provided as a help
  • Value correctness is checked (when possible) on the GUI side

A color code is used to display the value status:

  • Blue: this is the default value; it is not specifically set for the Whisperer and comes from the server default
  • Orange: modification in progress
  • Green: valid change
  • Black: the value is specifically set for the Whisperer
  • Red: the value is not correct. A help message is provided in the tooltip of the error icon.
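As a sketch, the precedence between these states could be resolved like this (the ordering of the checks is an assumption, not Spider's actual GUI code):

```python
from typing import Optional

def value_color(is_default: bool, editing: bool, valid: Optional[bool],
                just_saved: bool) -> str:
    """Pick the display colour for one configuration value.

    `valid` is None when the value has not been checked yet,
    True/False once the GUI-side check has run.
    """
    if valid is False:
        return "red"      # invalid value; help shown in the error tooltip
    if editing:
        return "orange"   # modification in progress
    if just_saved:
        return "green"    # valid change
    if is_default:
        return "blue"     # server default, not set on this Whisperer
    return "black"        # explicitly set for this Whisperer

assert value_color(True, False, None, False) == "blue"
assert value_color(False, True, None, False) == "orange"
assert value_color(False, False, False, False) == "red"
```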

Remote monitoring

The status of the Whisperer is sent regularly to Spider.

The first tab of the Whisperer details view shows a summary of information:

  • CPU usage of the Whisperer on its host, averaged since the last update
  • Memory usage, instantaneous
  • Time of capture start
  • Capture speed
  • Total uploaded data
  • Speed of API calls to the back office
  • All-time totals for this Whisperer
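"Averaged since the last update" implies the CPU figure is derived from cumulative process CPU time between two status reports. A minimal sketch of that computation (assumed, not Spider's actual code):

```python
def avg_cpu_percent(prev_cpu_seconds: float, cur_cpu_seconds: float,
                    prev_time: float, cur_time: float) -> float:
    """Average CPU usage (%) between two status updates, computed from
    cumulative process CPU time (e.g. user + system seconds)."""
    wall = cur_time - prev_time
    if wall <= 0:
        return 0.0  # clock went backwards or duplicate sample
    return 100.0 * (cur_cpu_seconds - prev_cpu_seconds) / wall

# 0.9 s of CPU consumed over a 3 s reporting interval -> 30 %
assert abs(avg_cpu_percent(10.0, 10.9, 100.0, 103.0) - 30.0) < 1e-9
```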

New fields in Http Communications

· One min read

Hi,

To address client identification issues, I've made a few small updates to Spider.

Now the Http Communication resource includes 2 new properties:

  • stats.src.origin: the original IP address of the client. Stored as an IP in ES, so it is queryable like this: stats.src.origin:"10.1.22.0/16"
  • stats.src.identification: the identified client, extracted from:
    • The login of Basic Auth identification
    • The sub field of a JWT
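A sketch of how such an identification can be extracted from the Authorization header (the function name and details are illustrative, not Spider's actual parser; note that the JWT payload is read without signature verification, which is fine for labelling traffic):

```python
import base64
import json
from typing import Optional

def identify_client(authorization: str) -> Optional[str]:
    """Extract a client identity: the login of Basic auth,
    or the 'sub' claim of a Bearer JWT."""
    scheme, _, credentials = authorization.partition(" ")
    if scheme.lower() == "basic":
        login, _, _ = base64.b64decode(credentials).decode().partition(":")
        return login
    if scheme.lower() == "bearer":
        payload_b64 = credentials.split(".")[1]
        payload_b64 += "=" * (-len(payload_b64) % 4)  # restore b64 padding
        payload = json.loads(base64.urlsafe_b64decode(payload_b64))
        return payload.get("sub")
    return None

assert identify_client("Basic " + base64.b64encode(b"alice:secret").decode()) == "alice"
```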

Both fields can be queried and aggregated on. Both are accessible in the grid (you need to disconnect/reconnect to refresh the grid's available columns) and in the detail view.

What's more, 2 other fields have been added to the grid: the response date and the x-forwarded-for request header.

Enjoy!

A distributed system needs integrated monitoring.

· One min read

Elasticsearch, Redis, Polling Queues, Circuit breakers, REST microservices...

Got an error or a timeout in the logs? Who generated it, what was the load, how was the system behaving at that time? How did it recover?

If you don't have integrated monitoring, you're driving blind!

I was starting to feel this way, so I dedicated this week's work to developing a new app in the system that monitors all the others. The other applications already expose circuit breaker stats and queue stats (with REST stats coming soon), and I access ElasticSearch and Redis stats through their APIs.

Everything is collected continuously, sent to another ES index, and Kibana helped me build easy and fast reports on Spider's health and speed through time. Really easy when all your pieces are already in place and easily extendable!
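The collection loop can be sketched like this (names and document shapes are assumptions; in the real system the stats come from each component's API and the document goes to a dedicated ES index):

```python
import time

def collect_snapshot(sources: dict) -> dict:
    """Poll each component's stats callable and build one monitoring
    document, ready to be indexed into a monitoring ES index."""
    doc = {"@timestamp": time.time()}
    for name, fetch in sources.items():
        try:
            doc[name] = fetch()
        except Exception as exc:  # a failing component is data too
            doc[name] = {"error": str(exc)}
    return doc

# Stand-ins for real stats endpoints (Redis INFO, queue depths, ES _stats...)
snapshot = collect_snapshot({
    "redis": lambda: {"used_memory": 1048576},
    "queues": lambda: {"pending": 42},
})
assert snapshot["queues"]["pending"] == 42
```

Because every snapshot lands in ES with a timestamp, Kibana can chart health and speed over time with no extra plumbing.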

Architecture upgrade

· One min read

Yesterday, Spider underwent a major architecture upgrade:

  • I moved to one index per resource type
    • To get ready for ES 6
    • To improve shard balance in the cluster
    • To reduce IO access, because small-volume resources are separated from big ones
    • To improve ES aggregation speed
  • I introduced a poller between Redis and ElasticSearch for the four main resources
    • The load constraint is now focused on only two microservices, and only on Redis
    • The load on ES is smoothed
  • I added an in-memory cache on Whisperer configuration access from the microservices, to drastically reduce unnecessary calls
  • I now have 20 microservices in my cluster, with many being multi-instantiated 😎
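The poller pattern can be sketched like this, with a plain in-memory deque standing in for the Redis list and a callback standing in for the ES bulk API (all names are illustrative):

```python
from collections import deque

def poll_and_bulk(queue: deque, bulk_index, batch_size: int = 500) -> int:
    """Drain up to batch_size documents from the input queue (standing in
    for a Redis list) and index them with a single bulk call, so writes
    reach ES as smooth, bounded batches instead of a raw input stream."""
    batch = []
    while queue and len(batch) < batch_size:
        batch.append(queue.popleft())
    if batch:
        bulk_index(batch)  # one bulk request per poll cycle
    return len(batch)

q = deque({"id": i} for i in range(1200))
bulks = []
while poll_and_bulk(q, bulks.append):
    pass
assert [len(b) for b in bulks] == [500, 500, 200]
```

The writers only ever touch the fast store; ES indexing speed bounds the drain rate rather than the intake rate.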

The Streetsmart instance has been upgraded, with a complete data purge (sorry).

Let's see how it behaves. It should be much more stable!

Move to cluster.

· One min read

To be able to capture more environments, Spider has moved to Cluster mode:

  • Docker Swarm using Docker stacks
  • ElasticSearch in cluster

The next step will be the Redis architecture: ES is not fast enough at indexing.

So I'll direct all input load to Redis only, and decouple the performance needs by adding synchronisation pollers, like we do on Streetsmart. I thought I could live without them for longer by using bulks... but if I don't want to increase the hardware, I need to improve the architecture!

Spider released for StreetSmart team!

· One min read

First 'official' disclosure of Spider. I'm eager to get feedback, good or bad!

I've been working on Spider since December 2015. That's a reboot of a project I did in... guess when... 2003!

Back then, it was called RAPID Spy, because it was meant to spy on and analyse the RAPID architecture, which was a trademark of Reuters Financial Software. It was used to analyse functional test results as well as technical and performance tests. But in fact, it proved more powerful than that. RAPID was essentially a Web-based architecture, and although Web Oriented Architecture was still uncommon in 2003, I have used RAPID Spy in all my projects since then. It even helped me decode the proprietary protocol of a COTS software we set up for one of our customers in 2008. I even built performance tests out of it.

In 2003, I bought back the RAPID Spy rights from Reuters. I was sure I would reuse it, and that they wouldn't do anything with it.

From RAPID Spy to Spider, the base concept is the same, but in terms of architecture, technical expertise, evolutivity, and scalability, there is a world of difference! 12 years of experience building systems lie in between.

Hope you'll like it! :-)