What’s New in Skedler

The November release of Skedler brought many improvements, such as auto-scaling support for Grafana dashboard layout reports and an updated user interface. The December release added more features, including auto-scaling support for charts in Kibana and the option to configure a proxy URL. We are very proud of these releases, but the team is always looking for new ways to make Skedler better for you. We are already improving the product further and want you to know about our newly added features and UI. So, before we end the year, here is an update on the features we released, with a closer look at some of the important ones in this blog.

Halt your reporting schedules on specific days

Want to make sure you are not sending your reports on a holiday? We got you covered! You can now choose the days you do not wish to schedule reports with our new Weekday feature.

Weekday feature

Autoscaling support for charts in Kibana

Skedler now supports autoscaling of charts in Kibana. You do not have to worry about your reports being messy or missing out on important information when you add more data to your chart because Skedler will automatically take care of that.

Autoscaling in Kibana

Added auto-scaling support for Grafana dashboard layout reports

You can now stop worrying about your graphs and modules getting distorted in your reports as Skedler has added auto-scaling support for generating reports from Grafana Dashboard.

Autoscaling in Grafana

Added a privilege for Super Admin users to change their email ID

Super Admins can now update their email ID in their profile. You can replace the email ID you used when you created your account with a new one.

Super Admin User

Generate reports using the Grafana dashboard timezone

You can now generate reports in Skedler that follow your Grafana time window by selecting “use dashboard time” in Skedler, so you do not have to worry about reports covering the wrong time range.

Dashboard Timezone

Support for fiscal year time window in Grafana dashboards

Grafana 8.2 introduced a configurable fiscal year option in the time picker. This enables fiscal quarters as time ranges for business-focused and executive dashboards. Skedler now supports this feature too!

Fiscal Time Year Window

Added support for Outlook SMTP

Skedler now supports Outlook, so you can set it up as a notification channel in your Skedler account.

Outlook SMTP

These are just some of the new features of Skedler. For more details on these features, do check out our release notes.

If you would like to stay updated on the latest release news or know about upcoming features, please feel free to reach out to the team and keep an eye out for our monthly newsletters.

Kibana Single Sign-On with OpenId Connect and Azure Active Directory

Introduction

Open Distro supports OpenID Connect, so you can seamlessly connect your Elasticsearch cluster with identity providers like Azure AD, Keycloak, Auth0, or Okta. To set up OpenID support, you just need to point Open Distro to the metadata endpoint of your provider, and all relevant configuration information is imported automatically. In this article, we will implement a complete OpenID Connect setup, including Single Sign-On for Open Distro for Kibana.

What is OpenID Connect?

OpenID Connect 1.0 is a simple identity layer on top of the OAuth 2.0 protocol. It allows Clients to verify the identity of the End-User based on the authentication performed by an Authorization Server, as well as to obtain basic profile information about the End-User in an interoperable and REST-like manner.

OpenID Connect allows clients of all types, including Web-based, mobile, and JavaScript clients, to request and receive information about authenticated sessions and end-users. The specification suite is extensible, allowing participants to use optional features such as encryption of identity data, the discovery of OpenID Providers, and session management, when it makes sense for them.

Configuring OpenID Connect in Azure AD

Next, we will set up an OpenID Connect client application in Azure AD which we will later use for Open Distro for Elasticsearch Kibana Single Sign-On. In this post, we will just describe the basic steps.

Adding an OpenID Connect client application

The first step is to register an application with the Microsoft identity platform that supports OpenID Connect. Please refer to the official documentation.

Log in to Azure AD, open the Authentication tab under App registrations, enter the redirect URL https://localhost:5601/auth/openid/login, and save it.

redirect URL – https://localhost:5601/auth/openid/login

Besides the client ID, we also need the client secret in our Open Distro for Kibana configuration. This is an extra layer of security: an application can only obtain an id token from the IdP if it provides the client secret. In Azure AD you can find it under the Certificates & secrets tab of the client settings.

Connecting OpenDistro with Azure AD

For connecting Open Distro with Azure AD we need to set up a new authentication domain with type openid in config.yml. The most important information we need to provide is the Metadata Endpoint of the newly created OpenID connect client. This endpoint provides all configuration settings that Open Distro needs. The URL of this endpoint varies from IdP to IdP. In Azure AD the format is:

openId end point IDP – https://login.microsoftonline.com/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx/v2.0/.well-known/openid-configuration
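The metadata (discovery) document served at this endpoint lists the endpoints Open Distro needs. A trimmed sketch of what Azure AD returns (where &lt;tenant-id&gt; stands for your directory ID; the real document contains many more fields):

```json
{
  "issuer": "https://login.microsoftonline.com/<tenant-id>/v2.0",
  "authorization_endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/authorize",
  "token_endpoint": "https://login.microsoftonline.com/<tenant-id>/oauth2/v2.0/token",
  "jwks_uri": "https://login.microsoftonline.com/<tenant-id>/discovery/v2.0/keys"
}
```

Open Distro reads this document once and derives all the settings it needs, which is why only the metadata URL appears in the configuration below.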

Since we want to connect Open Distro for Elasticsearch Kibana with Azure AD, we also add a second authentication domain which will use the internal user database. This is required for authenticating the internal Kibana server user. Our config.yml file now looks like:

authc:
  basic_internal_auth_domain:
    http_enabled: true
    transport_enabled: true
    order: 0
    http_authenticator:
      type: "basic"
      challenge: false
    authentication_backend:
      type: "internal"
  openid_auth_domain:
    http_enabled: true
    order: 1
    http_authenticator:
      type: openid
      challenge: false
      config:
        subject_key: preferred_username
        roles_key: roles
        openid_connect_url: https://login.microsoftonline.com/xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxx/v2.0/.well-known/openid-configuration
    authentication_backend:
      type: noop

Adding users and roles to Azure AD

While an IDP can be used as a federation service to pull in user information from different sources such as LDAP, in this example we use the built-in user management. We have two choices when mapping the Azure AD users to Open Distro roles. We can do it by username, or by the roles in Azure AD. While mapping users by name is a bit easier to set up, we will use the Azure AD roles here.

With the default configuration, two appRoles are created, skedler_role and guidanz_role, which can be viewed by choosing the App registrations menu item within the Azure Active Directory blade, selecting the Enterprise application in question, and clicking the Manifest button.

A manifest is a JSON object that looks similar to:

{
  "appId": "xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx",
  "appRoles": [
    {
      "allowedMemberTypes": [
        "User"
      ],
      "description": "Skedler with administrator access",
      "displayName": "skedler_role",
      "id": "xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx",
      "isEnabled": true,
      "value": "skedlerrole"
    },
    {
      "allowedMemberTypes": [
        "User"
      ],
      "description": "guidanz with readonly access",
      "displayName": "guidanz_role",
      "id": "xxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxx",
      "isEnabled": true,
      "value": "guidanzrole"
    }
  ],
  … etc.
}

There are many different ways we might decide to map how users within AAD will be assigned roles within Elasticsearch, for example, using the tenantid claim to map users in different directories to different roles, using the domain part of the name claim, etc.

With the roles token attribute configured earlier, however, the appRole to which an AAD user is assigned will be sent as the value of the role claim within the OpenID Connect token, allowing:

  • Arbitrary appRoles to be defined within the manifest
  • Users within the Enterprise application to be assigned to these roles
  • The role claim sent within the OpenID Connect token to determine access within Elasticsearch.

For the purposes of this post, let’s define a Superuser role within the appRoles:

{
  "appId": "<guid>",
  "appRoles": [
    {
      "allowedMemberTypes": [
        "User"
      ],
      "displayName": "Superuser",
      "id": "18d14569-c3bd-439b-9a66-3a2aee01d14d",
      "isEnabled": true,
      "description": "Superuser with administrator access",
      "value": "superuser"
    },
    … other roles
  ],
  … etc.
}

And save the changes to the manifest.
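On the Open Distro side, the appRole value from the token can then be mapped to a security role in roles_mapping.yml. A minimal sketch, assuming the roles_key: roles setting shown earlier and the built-in all_access role (the backend role string must match the appRole "value" field, here "superuser"):

```yaml
# roles_mapping.yml (sketch): grant all_access to users whose OIDC token
# carries the "superuser" appRole value defined in the Azure AD manifest
all_access:
  reserved: false
  backend_roles:
    - "superuser"
  description: "Map the Azure AD Superuser appRole to all_access"
```

With this mapping in place, any AAD user assigned to the Superuser appRole logs in to Kibana with full administrative access.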

Configuring OpenID Connect in Open Distro for Kibana

The last part is to configure OpenID Connect in Open Distro for Kibana. Configuring the Kibana plugin is straightforward: choose OpenID as the authentication type, and provide the Azure AD metadata URL, the client name, and the client secret. Please refer to the official documentation.

Activate OpenID Connect by adding the following to kibana.yml:

opendistro_security.auth.type: "openid"
opendistro_security.openid.connect_url: "https://login.microsoftonline.com/xxxxx-xxxx-xxxx-xxxx-xxxxxxxxx/v2.0/.well-known/openid-configuration"
opendistro_security.openid.client_id: "xxxxx-xxxx-xxxx-xxxx-xxxxxxxxx"
opendistro_security.openid.client_secret: "xxxxxxxxxxxxxxxxxxxxxxxxxxx"
opendistro_security.openid.base_redirect_url: "https://localhost:5601"

Done. We can now start Open Distro for Kibana and enjoy Single Sign-On with Azure AD! If we open Kibana, we get redirected to the login page of Azure AD. After providing username and password, Kibana opens, and we’re logged in.

Summary

OpenID Connect is an industry standard for providing authentication information. Open Distro for Elasticsearch and its Kibana plugin support OpenID Connect out of the box, so you can use any OpenID-compliant identity provider to implement Single Sign-On in Kibana. These IdPs include Azure AD, Keycloak, Okta, Auth0, Connect2id, and Salesforce.

Reference

If you wish to have an automated reporting application, we recommend downloading  Skedler Reports.

Installing and Configuring Skedler Reports as a Kibana Plugin in an Elasticsearch and Kibana Environment Using Docker Compose

Introduction

If you are using the ELK stack, you can now install Skedler as a Kibana plugin. The Skedler Reports plugin is available for Kibana versions 6.5.x to 7.6.x.

Let’s take a look at the steps to Install Skedler Reports as a Kibana plugin.

Prerequisites:

  1. A Linux machine
  2. Docker Installed
  3. Docker Compose Installed

Let’s get started!

Log in to your Linux machine, update the package repository, and install Docker and Docker Compose. Then follow the steps below.

Setting Up Skedler Reports

Create a directory, say skedlerplugin

ubuntu@guidanz:~$ mkdir skedlerplugin

ubuntu@guidanz:~$ cd skedlerplugin/

ubuntu@guidanz:~$ vim docker-compose.yml

Now, create a Docker Compose file for Skedler Reports as below. You will also need a Skedler Reports configuration file, reporting.yml, which we create in the next step.

version: "2.4"

services:
  # Skedler Reports container
  reports:
    image: skedler/reports:latest
    container_name: reports
    privileged: true
    cap_add:
      - SYS_ADMIN
    volumes:
      - /sys/fs/cgroup:/sys/fs/cgroup:ro
      - reportdata:/var/lib/skedler
      - ./reporting.yml:/opt/skedler/config/reporting.yml
    command: /opt/skedler/bin/skedler
    depends_on:
      elasticsearch: { condition: service_healthy }
    ports:
      - 3000:3000
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:3000"]
    networks: ['stack']

volumes:
  reportdata:
    driver: local

networks: {stack: {}}

Create the Skedler Reports configuration file reporting.yml:

ubuntu@guidanz:~$ cd skedlerplugin/

ubuntu@guidanz:~$ vim reporting.yml

Download the reporting.yml file found here

Setting Up Elasticsearch

You also need to create an Elasticsearch configuration file, elasticsearch.yml. The Docker Compose service for Elasticsearch is below:

  # Elasticsearch container
  elasticsearch:
    container_name: elasticsearch
    hostname: elasticsearch
    image: "docker.elastic.co/elasticsearch/elasticsearch:7.6.0"
    logging:
      options:
        max-file: "3"
        max-size: "50m"
    environment:
      - http.host=0.0.0.0
      - transport.host=127.0.0.1
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms${ES_JVM_HEAP} -Xmx${ES_JVM_HEAP}"
    mem_limit: 1g
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - esdata:/usr/share/elasticsearch/data
    ports: ['9200:9200']
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:9200/_cat/health"]
    networks: ['stack']

volumes:
  esdata:
    driver: local

networks: {stack: {}}

Create an Elasticsearch configuration file elasticsearch.yml and paste the config as below.

cluster.name: guidanz-stack-cluster

node.name: node-1

network.host: 0.0.0.0

path.data: /usr/share/elasticsearch/data

http.port: 9200

xpack.monitoring.enabled: true

http.cors.enabled: true

http.cors.allow-origin: "*"

http.max_header_size: 16kb

Setting Up Skedler Reports as Kibana Plugin

Create a directory inside skedlerplugin, say kibanaconfig

ubuntu@guidanz:~$ mkdir kibanaconfig

ubuntu@guidanz:~$ cd kibanaconfig/

ubuntu@guidanz:~$ vim Dockerfile

Now, create a Dockerfile for Kibana as below,

FROM docker.elastic.co/kibana/kibana:7.6.0

RUN ./bin/kibana-plugin install https://www.skedler.com/plugins/skedler-reports-plugin/4.10.0/skedler-reports-kibana-plugin-7.6.0-4.10.0.zip

Then, copy the URL of the Skedler Reports plugin matching your exact Kibana version from here.
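The plugin bundle name embeds both the Kibana version and the Skedler plugin version. Purely as an illustration, the URL can be assembled from shell variables so an upgrade only touches one place (this just mirrors the pattern of the example URL above; check the download page for the versions that actually exist):

```shell
# Assemble the Skedler plugin URL from the Kibana and plugin versions
KIBANA_VERSION="7.6.0"
SKEDLER_PLUGIN_VERSION="4.10.0"
PLUGIN_URL="https://www.skedler.com/plugins/skedler-reports-plugin/${SKEDLER_PLUGIN_VERSION}/skedler-reports-kibana-plugin-${KIBANA_VERSION}-${SKEDLER_PLUGIN_VERSION}.zip"
echo "$PLUGIN_URL"
```

Substituting the result into the Dockerfile's RUN line keeps the image build in step with the Kibana base image version.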

You also need to add a Docker Compose service for Kibana, as below,

  # Kibana container
  kibana:
    container_name: kibana
    hostname: kibana
    build:
      context: ./kibanaconfig
      dockerfile: Dockerfile
    image: kibanaconfig
    logging:
      options:
        max-file: "3"
        max-size: "50m"
    volumes:
      - ./kibanaconfig/kibana.yml:/usr/share/kibana/config/kibana.yml
      - ./kibanaconfig/skedler_reports.yml:/usr/share/kibana/plugins/skedler/config/skedler_reports.yml
    ports: ['5601:5601']
    networks: ['stack']
    depends_on:
      elasticsearch: { condition: service_healthy }
    restart: on-failure
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:5601/"]
      retries: 6

Create a Kibana configuration file kibana.yml inside the kibanaconfig folder and paste the config as below.

ubuntu@guidanz:~$ cd kibanaconfig/

ubuntu@guidanz:~$ vim kibana.yml

server.port: 5601

server.host: "0.0.0.0"

elasticsearch.hosts: ["http://elasticsearch:9200"]

server.name: “full-stack-example”

xpack.monitoring.enabled: true

Create the Skedler Reports plugin configuration file skedler_reports.yml inside the kibanaconfig folder and paste the config as below.

ubuntu@guidanz:~$ cd kibanaconfig/

ubuntu@guidanz:~$ vim skedler_reports.yml

#/*********** Skedler Access URL *************************/
skedler_reports_url: "http://ip_address:3000"

#/*********************** Basic Authentication *********************/
# If Skedler Reports uses any username and password
#skedler_username: user
#skedler_password: password

Configure the Skedler Reports server URL in the skedler_reports_url variable; by default, it is set as shown above. If the Skedler Reports server URL requires basic authentication (for example, behind Nginx), uncomment and configure skedler_username and skedler_password with the basic authentication credentials. Now run docker-compose:

ubuntu@guidanz:~/skedlerplugin$ docker-compose up -d

Access Skedler Reports using the IP and port, and you will see the Skedler Reports UI.

| http://ip_address:3000

Access Elasticsearch using the IP and port to verify that it is running.

| http://ip_address:9200

Access Kibana using the IP and port, and you will see the Kibana UI.

| http://ip_address:5601

So now the composite docker-compose.yml file will look like below,

version: "2.4"

services:
  # Skedler Reports container
  reports:
    image: skedler/reports:latest
    container_name: reports
    privileged: true
    cap_add:
      - SYS_ADMIN
    volumes:
      - /sys/fs/cgroup:/sys/fs/cgroup:ro
      - reportdata:/var/lib/skedler
      - ./reporting.yml:/opt/skedler/config/reporting.yml
    command: /opt/skedler/bin/skedler
    depends_on:
      elasticsearch: { condition: service_healthy }
    ports:
      - 3000:3000
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:3000"]
    networks: ['stack']

  # Elasticsearch container
  elasticsearch:
    container_name: elasticsearch
    hostname: elasticsearch
    image: "docker.elastic.co/elasticsearch/elasticsearch:7.6.0"
    logging:
      options:
        max-file: "3"
        max-size: "50m"
    environment:
      - http.host=0.0.0.0
      - transport.host=127.0.0.1
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms${ES_JVM_HEAP} -Xmx${ES_JVM_HEAP}"
    mem_limit: ${ES_MEM_LIMIT}
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - ./elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - esdata:/usr/share/elasticsearch/data
    ports: ['9200:9200']
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:9200/_cat/health"]
    networks: ['stack']

  # Kibana container
  kibana:
    container_name: kibana
    hostname: kibana
    build:
      context: ./kibanaconfig
      dockerfile: Dockerfile
    image: kibanaconfig
    logging:
      options:
        max-file: "3"
        max-size: "50m"
    volumes:
      - ./kibanaconfig/kibana.yml:/usr/share/kibana/config/kibana.yml
      - ./kibanaconfig/skedler_reports.yml:/usr/share/kibana/plugins/skedler/config/skedler_reports.yml
    ports: ['5601:5601']
    networks: ['stack']
    depends_on:
      elasticsearch: { condition: service_healthy }
    restart: on-failure
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:5601/"]
      retries: 6

volumes:
  esdata:
    driver: local
  reportdata:
    driver: local

networks: {stack: {}}

You can simply bring the stack down and up:

ubuntu@guidanz:~/skedlerplugin$ docker-compose down 

ubuntu@guidanz:~/skedlerplugin$ docker-compose up -d

Summary

Docker Compose is a useful tool for managing container stacks for your client: it manages all related containers with a single command.

The Best Tools for Exporting Elasticsearch Data from Kibana

As a tool for visualizing Elasticsearch data, Kibana is a perfect choice. Its UI lets you create dashboards, searches, and visualizations in minutes and analyze your data with their help.

Despite having tons of visualizations, the open-source version of Kibana does not have advanced reporting capability. Automating the export of data into CSV, Excel, or PDF requires additional plugins.

We wrote an honest and unbiased review of the following tools that are available for exporting data directly from Elasticsearch.

  1. Flexmonster Pivot plugin for Kibana 
  2. Sentinl (for Kibana)
  3. Skedler Reports

1. Flexmonster Pivot plugin for Kibana

https://github.com/flexmonster/pivot-kibana

Flexmonster Pivot covers the need to summarize business data and display results in a cross-table format, interactively and fast. All these Excel-like features, which so many of you are used to, and its extended API will multiply your analytics results remarkably.

Though initially created as a pivot table component that can be incorporated into any app that uses JavaScript, it can serve as a part of Kibana as well. You can connect it to the Elasticsearch index, fetch the documents from it and start exploring the data.

Pros of Flexmonster Pivot plugin for Kibana

  • Flexmonster is in line with the concept of Kibana
  • Simply embeddable Pivot for Kibana

Cons of Flexmonster Pivot plugin for Kibana

  • To automate the exporting of data on a periodic basis, you need to write your own cron job.
  • Flexmonster Pivot plugin installation is a bit tricky. 

2. Sentinl (for Kibana)

https://github.com/sirensolutions/sentinl

SENTINL extends Kibana with Alerting and Reporting functionality to monitor, notify and report on data series changes using standard queries, programmable validators and a variety of configurable actions – Think of it as a free and independent “Watcher” which also has scheduled “Reporting”.

SENTINL is also designed to simplify the process of creating and managing alerts and reports in Siren Investigate/Kibana 6.x via its native App Interface, or by using native watcher tools in Kibana 6.x+.

Pros of Sentinl

  • It’s simple to install and configure
  • Added as a Kibana plugin.

Cons of Sentinl

  • This tool supports only 6.x versions of Elasticsearch. It does not support 7.x.
  • For non-technical users, it’s difficult to use 
  • Automation requires scripting which makes it laborious

3. Skedler Reports

https://www.skedler.com/

Disclosure: Skedler Reports is one of our products.

Skedler offers a simple, easy-to-add reporting and alerting solution for Elastic Stack and Grafana. There is also a plugin for Kibana that is easy to install and use with Elasticsearch data, called Skedler Reports as Kibana Plugin.

Pros of Skedler Reports

  • Simple to install, configure, and use
  • Send HTML, PDF, XLS, CSV reports on-demand or periodically via email or #slack
  • Report setup takes less than 5 minutes
  • Easy to use, no coding required

Cons of Skedler Reports

  • It requires a paid license which includes software and also enterprise support
  • Installation is difficult for users who are not fully familiar with Elastic Stack or Grafana

What tools do you use?

Do you have to regularly export data from Kibana for external analysis or reporting purposes? Do you use any other third-party plugins?   Email us about the tool at hello at skedler.com.

4 steps to put your Security Onion reports on auto-pilot!

If you are running Elastic Stack with Security Onion for Intrusion Detection and Enterprise Security Monitoring, you already know the importance of report automation. Whether compliance or ad hoc, MSSPs can sink countless hours into reporting. The manual process of building and sending those reports means valuable resources are taken away from the role they were hired to perform.

With Skedler, a Security Onion user with any privilege level can now generate a report and automate its distribution. All you need are the Elasticsearch and Kibana admin credentials to connect your Security Onion environment with Skedler. This blog post focuses on how to email PDF, PNG, HTML inline, Excel, or CSV reports from Security Onion using Skedler’s integration with Kibana.

What is Security Onion?

If you’ve never heard of Security Onion before, it is a Linux distro for Intrusion Detection, Network Security Monitoring, and Log Management. Security Onion, by Doug Burks, is an Ubuntu-based distribution containing many security tools such as Snort, Bro, OSSEC, Sguil, Squert, and more. The distribution allows an analyst to configure and run an intrusion detection system with complete monitoring and reporting capability in a few minutes.

Source: Security Onion website

Why Skedler Reports for Security Onion?

With Skedler, MSSPs can generate compliance reports (e.g. PCI ASV reports) quickly and easily to save countless man-hours, deliver reports 10x faster, and enable their customers to mitigate vulnerabilities more quickly. You can use filters to create specific reports for specific projects, tailor reports for users from high-level executives to technicians, and schedule reports to be delivered at any time.

Skedler Reports offers the most powerful, flexible, and easy-to-use data monitoring solution that companies use to exceed customer SLAs, achieve compliance, and empower internal IT and business leaders.

By using Skedler Reports, you can enjoy the following benefits:

  • Simple installation, quick configuration, faster deployment
  • Send visually appealing, personalized reports
  • Report setup takes less than 5 minutes
  • Send PDF, PNG, HTML Inline, Excel or CSV reports on-demand or periodically via email or slack channel.
  • Help users see and understand data faster with customized mobile & print-ready reports

How to generate reports from Security Onion using Skedler?

There are four basic steps to start generating Security Onion reports using Skedler:

Install Skedler

The obvious first step is installing Skedler on your machine. To download Skedler, you can click on this link and enter the required information. Once downloaded, you can start the installation depending on the OS type. We support Debian, Docker, Kubernetes, Linux, macOS, and Windows. You can refer to this Installation Guide to know more about the steps.

Activate Skedler

After installation, the next step is activation. An email containing a license key will be sent to you after the download. Using this key, you can activate Skedler either online or offline. Here is a sneak peek of the online activation steps:

Short preview of Online Activation

You can watch the video tutorials for Online Activation and Offline Activation. If you wish to read the docs instead, you can find them here.

Connect Security Onion with Skedler

Skedler is now ready to start generating reports from any data source of your choice; it just needs to be connected to one. It takes less than a minute to connect any data source to Skedler. Moreover, you can choose whether the data source credentials are embedded or the user is prompted for them to grant access.

Adding Security Onion with Kibana data source

Check out this quick tutorial video to see how easy it is!

Generate Security Onion Reports

Here comes the fun part! Without using a single line of code, now you can automate your Threat Analysis report, Vulnerability Report, and Network Traffic Analysis Report from Security Onion and share it with the right audience at the right time. There are three steps to generating a report:

Report Designing

With Skedler, you can design the report with text, parameters, elements as well as images. You can add your company logo to these reports and report names to create more credibility among your customers and other stakeholders. 

Adding company logo and adding report name using auto-parameters

Check out how easy it is to add charts from your Kibana dashboard to these Skedler reports:

Using drag-n-drop feature to add the charts. Resize them as required.

Other options available at the design stage are:

a. Adding Burst Filter: This filter lets you use one dashboard to send reports to multiple customers at the same time based on different dashboard queries.

b. Selecting Time Window: You can choose between selecting any particular time frame or using the dashboard time window. 

Report Scheduling

Once the report design is completed, we can set the Schedule. Here, you can set the recurrence and frequency. You also get the option of adding holidays. The export options include PNG, HTML Inline, Excel, and CSV.

Report Distribution

Skedler allows seamless distribution via Email as well as Slack channel. For the email channel, you can add the recipients and use parameters to customize the subject or body of the email. Similarly, for Slack, you can select the channel or the direct recipient to receive these reports upon generation.

These reports can be generated, downloaded, and mailed at any time irrespective of the schedule. You can share the report with any user within the organization. You can edit the report design or schedule and check the history of these reports as well.

To see the Security Onion report generation in action, check out this step-by-step tutorial.

Summary

This blog was a very quick overview of how to automate reports from Security Onion Dashboards using Skedler. We have accumulated a series of documentation and videos for you to check out all of the above-mentioned information in detail. If you haven’t already, download Skedler now and try it free for 15 days.

Tabular Reports from Elastic Stack – New in Skedler Reports v4.4

We are excited to announce the release of Skedler Reports v4.4. As always, it’s packed with capabilities to help you meet compliance, audit, and snapshot reporting requirements.

Tabular PDF, Excel, CSV Reports from Kibana Data Table

If you are a security analyst or network admin looking for the list of unauthorized IP addresses connecting to your machines, Skedler can deliver the data to you in the form of PDF or Excel. With just a couple of clicks, schedule a PDF and/or Excel report that uses the Kibana data table as a source, sit back and have the reports delivered to your stakeholders automatically!

[video_embed video=”l-4JSKe9ee4″ parameters=”” mp4=”” ogv=”” placeholder=”” width=”700″ height=”400″]

Schedule Reports with Custom Time Ranges

If your customer needs a daily report that summarizes the top security events during the work hours of 9 AM – 5 PM, you can send it to them right away. Simply create a custom time range in Kibana and customize your dashboard to use this time range.  In Skedler, schedule a daily report with the dashboard as a data source and you’re all set!

Here is the list of additional features in the new release:

  • You can use the latest features in Elastic Stack 7.3 and Grafana 6.3 and generate reports with Skedler.
  • Users do not need administrator privileges to configure Grafana as a data source in Skedler.

Go Ahead and Try it Out

Test out the data table reports with custom time ranges in ELK 7.3 or Grafana 6.3 environment! Start now below by doing the following:

  1. Download Skedler Reports
  2. Follow the simple steps in our documentation and start generating reports.

An Easy Way to Export / Import Dashboards, Searches and Visualizations from Kibana

Introduction

Manually recreating Kibana dashboards, searches, and visualizations during upgrades, production deployment, or recovery is a time-consuming affair. The easiest way to recreate prebuilt Kibana dashboards and other objects is to export and import them. This can be achieved by using:

  • Kibana API (available since Kibana 7.x) 
  • Kibana UI

If you are looking to export and import Kibana dashboards and their dependencies automatically, we recommend the Kibana APIs. You can also export and import dashboards from the Kibana UI.

Note: Users should add the dependencies of the dashboards, such as visualizations and index patterns, individually when exporting or importing from the Kibana UI.

Export Objects From Kibana API

The export API enables you to retrieve a set of saved objects that can later be imported into Kibana.

Request

POST /api/saved_objects/_export

Request Body

At least one of type or objects must be passed within the request body.

type (optional)

(array/string) The saved object type(s) that the export should be limited to.

The following example exports all index pattern saved objects.

POST api/saved_objects/_export { "type": "index-pattern" }

Example Curl:

curl -X POST "http://localhost:5601/api/saved_objects/_export" -H 'kbn-xsrf: true' -H 'Content-Type: application/json' -d '{ "type": "index-pattern" }'

objects (optional)

(array) A list of objects to export.

The following example exports specific saved objects.

POST api/saved_objects/_export
{
  "objects": [
    {
      "type": "dashboard",
      "id": "be3733a0-9efe-11e7-acb3-3dab96693fab"
    }
  ]
}

Example Curl:

curl -X POST "http://localhost:5601/api/saved_objects/_export" -H 'kbn-xsrf: true' -H 'Content-Type: application/json' -d '{ "objects": [ { "type": "dashboard", "id": "be3733a0-9efe-11e7-acb3-3dab96693fab" } ] }'

Response Body

The response body will have a format of newline delimited JSON and the successful call returns a response code of 200 along with the exported objects as the response body.
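Because the export body is newline-delimited JSON, it is easy to post-process. Here is a minimal Python sketch; the `parse_export` helper and the sample objects are illustrative, not part of the Kibana API:

```python
import json

def parse_export(ndjson_text):
    """Parse a Kibana saved-objects export (newline-delimited JSON)
    into a list of dicts, skipping blank lines."""
    return [json.loads(line) for line in ndjson_text.splitlines() if line.strip()]

# Sample export body with two saved objects, shaped like the API response
sample = (
    '{"type":"index-pattern","id":"my-pattern","attributes":{"title":"my-pattern-*"}}\n'
    '{"type":"dashboard","id":"my-dashboard","attributes":{"title":"Look at my dashboard"}}\n'
)

objects = parse_export(sample)
print([o["type"] for o in objects])  # → ['index-pattern', 'dashboard']
```

A helper like this is handy for auditing what an export actually contains before importing it into another environment.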

Import Objects From Kibana API

The import API enables you to create a set of Kibana saved objects from a file created by the export API.

Request

POST /api/saved_objects/_import

Request Body

The request body must be of type multipart/form-data.

File

A file exported using the export API.

Example

The following example imports an index pattern and dashboard.

curl -X POST "localhost:5601/api/saved_objects/_import" -H "kbn-xsrf: true" --form file=@file.ndjson

The file.ndjson file would contain the following.

{"type":"index-pattern","id":"my-pattern","attributes":{"title":"my-pattern-*"}}
{"type":"dashboard","id":"my-dashboard","attributes":{"title":"Look at my dashboard"}}
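If you build the import file from code rather than by hand, each saved object is simply one JSON document per line. A minimal Python sketch, using the two example objects above:

```python
import json

# Objects to import, matching the example file.ndjson contents above
objects = [
    {"type": "index-pattern", "id": "my-pattern",
     "attributes": {"title": "my-pattern-*"}},
    {"type": "dashboard", "id": "my-dashboard",
     "attributes": {"title": "Look at my dashboard"}},
]

# Write one JSON document per line (newline-delimited JSON)
with open("file.ndjson", "w") as f:
    for obj in objects:
        f.write(json.dumps(obj) + "\n")
```

The resulting file.ndjson can then be uploaded with the curl command shown above.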

Response Body

A successful call returns a response code of 200 and a response body containing a JSON structure similar to the following example:

{
  "success": true,
  "successCount": 2
}
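In a script, you would typically check this response before trusting the import. A small sketch, with the example response body above hard-coded for illustration:

```python
import json

# Example import-API response body (hard-coded here for illustration)
response_body = '{"success": true, "successCount": 2}'

result = json.loads(response_body)
if result.get("success"):
    print(f"Imported {result['successCount']} objects")  # → Imported 2 objects
else:
    # On failure, Kibana's import API reports per-object problems
    print("Import failed:", result)
```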

Export Objects From Kibana UI:

You can export your objects from the Kibana UI under Management > Saved Objects. Select the checkboxes of the objects you want to export and click Export. Alternatively, to export objects by type:

  • Click Export objects.
  • Select the object types you want to export.
  • Click Export All.

Import Objects From Kibana UI:

You can import your JSON file from the Kibana UI under Management > Saved Objects > Import. Follow these steps to import your objects:

  • Click Import.
  • Navigate to the JSON file that represents the objects to import.
  • Indicate whether to overwrite objects already in Kibana.
  • Click Import.

Summary:

Exporting and importing saved objects is an effective and easy way to recreate dashboards and other objects in new environments or during migrations.

If you are looking to automate and simplify the process, we recommend using the Kibana APIs; otherwise, you can use the Kibana UI for granular export and import.

If you are looking for a Kibana reporting solution, be sure to test drive Skedler.

Skedler v4.1: Next Generation Reporting for Elasticsearch Kibana 7.0 and Grafana 6.1 is here

We are excited to announce that we have just released version 4.1 of Skedler Reports!  

Download Skedler 4.1 Now: https://www.skedler.com/download/

Self Service Reporting Solution for Elasticsearch Kibana 7.0 and Grafana 6.1

We understand that your stakeholders and customers need intuitive and flexible options to save time in receiving the data that matters to them, and we've achieved exactly that with the release of Skedler 4.1. The newly enhanced UI offers a delightful user experience for creating and scheduling reports from Elasticsearch Kibana 7.0 and Grafana 6.1.


Multi-Tenancy Capabilities

If you are a service provider, you need a simple and automated way to provide different groups of users (i.e. “tenants”) with access to different sets of data. Skedler 4.1’s powerful and secure multi-tenancy capabilities will now allow you to send reports to your customers from your multi-tenant analytics application within minutes.  Supported with Search Guard, Open Distro & X-Pack.

Intuitive and Mobile Ready Reports

Skedler 4.1 now allows you to produce high-resolution HTML reports from Elasticsearch Kibana and Grafana, making it easy and convenient for your end users to access critical data from their mobile devices and email clients. No more cumbersome, large PDF attachments.


The latest release also includes:

  • Support for the latest and greatest version of Elastic Stack and Grafana. Skedler 4.1 supports the following versions:
    • Elastic stack 6.7 and 7.0
    • Grafana 6.1.x
    • Open Distro for Elasticsearch 6.7 and 7.0.

Please continue to send us feedback on what new capabilities you'd like to see in the future by reaching out to us at hello@skedler.com.

Simplifying Skedler Reports with Elasticsearch and Kibana Environment using Docker Compose

Docker Compose is a tool for defining and running multi-container Docker applications (here, Skedler Reports, Elasticsearch, and Kibana). With Compose, you use a YAML file to configure your application's services. Then, with a single command, you create and start all the services from your configuration.

In this section, I will describe how to create a containerized installation of Skedler Reports, Elasticsearch, and Kibana.

Benefits:

  • You describe the multi-container setup in a clear way and bring up all the containers with a single command.
  • You can define the priority of each container and its dependencies on other containers.

Step-by-Step Instruction:

Step 1: Define services in a Compose file:

Create a file called docker-compose.yml in your project directory and paste the following:

docker-compose.yml:

version: "2.4"

services:
  # Skedler Reports container
  reports:
    image: skedler/reports:latest
    container_name: reports
    privileged: true
    cap_add:
      - SYS_ADMIN
    volumes:
      - /sys/fs/cgroup:/sys/fs/cgroup:ro
      - reportdata:/var/lib/skedler
      - ./reporting.yml:/opt/skedler/config/reporting.yml
    command: /opt/skedler/bin/skedler
    depends_on:
      elasticsearch: { condition: service_healthy }
    ports:
      - 3000:3000
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:3000"]
    networks: ['stack']

  # Elasticsearch container
  elasticsearch:
    container_name: elasticsearch
    hostname: elasticsearch
    image: "docker.elastic.co/elasticsearch/elasticsearch:7.1.1"
    logging:
      options:
        max-file: "3"
        max-size: "50m"
    environment:
      - http.host=0.0.0.0
      - transport.host=127.0.0.1
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms${ES_JVM_HEAP} -Xmx${ES_JVM_HEAP}"
    mem_limit: ${ES_MEM_LIMIT}
    ulimits:
      memlock:
        soft: -1
        hard: -1
    volumes:
      - ./config/elasticsearch/elasticsearch.yml:/usr/share/elasticsearch/config/elasticsearch.yml
      - esdata:/usr/share/elasticsearch/data
    ports: ['9200:9200']
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:9200/_cat/health"]
    networks: ['stack']

  # Kibana container
  kibana:
    container_name: kibana
    hostname: kibana
    image: "docker.elastic.co/kibana/kibana:7.1.1"
    logging:
      options:
        max-file: "3"
        max-size: "50m"
    volumes:
      - ./config/kibana/kibana.yml:/usr/share/kibana/config/kibana.yml
    ports: ['5601:5601']
    networks: ['stack']
    depends_on:
      elasticsearch: { condition: service_healthy }
    restart: on-failure
    healthcheck:
      test: ["CMD", "curl", "-s", "-f", "http://localhost:5601/"]
      retries: 6

volumes:
  esdata:
    driver: local
  reportdata:
    driver: local

networks: {stack: {}}

This Compose file defines three services, Skedler Reports, Elasticsearch and Kibana.

Step 2: Basic configurations using reporting.yml and kibana.yml

Create a file called reporting.yml in your project directory. A sample reporting.yml file can be found here.

Note: For more configuration options, refer to the article reporting.yml and ReportEngineOptions Configuration.

Create a file called kibana.yml in your project directory.

Note: For more configuration options, refer to the article kibana.yml.

Step 3: Build and run your app with docker-compose

From your project directory, start up your application by running:

sudo docker-compose up -d

Compose pulls the Skedler Reports, Elasticsearch, and Kibana images and starts the services you defined.

Skedler Reports is available at http://<hostIP>:3000, Elasticsearch at http://<hostIP>:9200, and Kibana at http://<hostIP>:5601.

Summary

Docker Compose is a useful tool for managing container stacks, letting you bring up and manage all related containers with a single command.

Skedler Update: Version 3.9 Released


Here’s everything you need to know about the new Skedler v3.9. Download the update now to take advantage of its new features for both Skedler Reports and Alerts.

What’s New With Skedler Reports v3.9

  • Support for:
    • ReadOnlyRest Elasticsearch/Kibana Security Plugin.
    • Chromium web browser for Skedler report generation.
    • Report bursting in Grafana reports if the Grafana dashboard is set with Template Variables.
    • Elasticsearch version 6.4.0 and Kibana version 6.4.0.
  • Ability to install Skedler Reports through Debian and RPM packages.
  • Simplified installation steps for Skedler Reports here.
  • Upgraded license module
    • NOTE: License reactivation is required when you upgrade Skedler Reports from an older version to the latest v3.9. Refer to this URL to reactivate the Skedler Reports license key.
    • Deactivation of Skedler license key in UI

What’s New With Skedler Alerts v3.9

  • Support for:
    • Installing Skedler Alerts via Debian and RPM packages.
    • GET method type in Webhook.
    • Elasticsearch 6.4.0.
  • Simplified installation steps for Skedler Alerts. Refer to this URL for installation guides.
  • Upgraded license module:
    • NOTE: License reactivation is required when you upgrade Skedler Alerts from an older version to the latest v3.9. Refer to this URL to reactivate the Skedler Alerts license key.
  • Deactivation of Skedler Alerts license key in UI

 

Get Skedler Reports

Download Skedler Reports

Get Skedler Alerts

Download Skedler Alerts

 

Copyright © 2023 Guidanz Inc