Install Oracle SOA Suite 12c in Cluster


This document is a reference guide for the installation and configuration of Oracle Fusion Middleware 12c products, including Oracle WebLogic Server 12.1.3 and Oracle WebTier 12.1.3.

The process described in this document is based on Oracle's SOA Enterprise Deployment Guide 12.2.1.

Environment Requirements

To follow the procedures described in this document, the servers must meet the following requirements:

  • Operating System: Red Hat Enterprise Linux 6.x
  • Database: An Oracle Database 12c Enterprise Edition instance available for use as the metadata repository
  • A software- or hardware-based load balancer
  • A shared disk presented to all servers that run the domain


Environment Information

Test environment information is used throughout this document to illustrate all procedures.

  • Hosts:
    • soa12cnodo1
    • soa12cnodo2
  • Domain Name: soadomain
  • Domain Home:
    • /opt/oracle/Middleware/admin/domains/soadomain/aserver/soadomain/
    • /opt/oracle/Middleware/admin/domains/soadomain/mserver/soadomain/
  • Java Home: /opt/oracle/jdk1.8.0_65
  • Middleware Home: /opt/oracle/Middleware/mwhome
  • Weblogic Home: /opt/oracle/Middleware/mwhome/wlserver
  • SOA_HOME: /opt/oracle/Middleware/mwhome/soa


Oracle Documentation

  • Oracle® Fusion Middleware Enterprise Deployment Guide for
Continue Reading

Configure Rsyslog to send our logs to ELK


We have seen before how to add filters and indexes for Filebeat and Topbeat. But in some cases we won't be able to install additional software to manage our logs; that's when Rsyslog is our best option. In this post we will configure an external Apache log that Rsyslog does not manage by default.

Configuring Rsyslog (client side)

We are going to create a new file in /etc/rsyslog.d that will contain our new input log configuration.

$ModLoad imfile  # load the file-input module (required before any $InputFile directives)

$InputFileName /var/log/apache2/access.log  # can NOT use wildcards – this is where logstash-forwarder would be nice
$InputFileTag apache-access-rs:  # Logstash throws grok errors if the ":" is anywhere besides at the end; shows up as "Program" in Logstash
$InputFileStateFile apache-access-rs  # can be anything; unique id used by rsyslog
$InputFileSeverity info
$InputFileFacility apacheaccess
$InputFilePollInterval 10
$InputFilePersistStateInterval 1000
$InputRunFileMonitor  # activate monitoring of the file defined above

apacheaccess.* @@ELK_server_private_IP:5544  # the two "@" signs tell rsyslog to use TCP; a single "@" would use UDP
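
On the ELK side, Logstash needs an input listening on that port. A minimal sketch (the port matches the rsyslog line above; the `type` name is an arbitrary choice, not something from the original post):

```
input {
  tcp {
    port => 5544
    type => "rsyslog-apache"
  }
}
```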
Continue Reading

Gather Infrastructure Metrics with Topbeat and ELK on CentOS 7


Topbeat, which is one of the several “Beats” data shippers that helps send various types of server data to an Elasticsearch instance, allows you to gather information about the CPU, memory, and process activity on your servers. In conjunction with an ELK server (Elasticsearch, Logstash, and Kibana), the data that Topbeat gathers can be used to easily visualize metrics so that you can see the status of your servers in a centralized place.

In this tutorial, we will show you how to use an ELK stack to gather and visualize infrastructure metrics by using Topbeat on a CentOS 7 server.


Load Topbeat Index Template in Elasticsearch

Because we are planning on using Topbeat to ship logs to Elasticsearch, we should load the Topbeat index template. The index template will configure Elasticsearch to analyze incoming Topbeat fields in an intelligent way.
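Concretely, loading the template amounts to a couple of commands like the following. This is only a sketch: the template URL and file name are assumptions for Topbeat 1.x, and Elasticsearch is assumed to be listening on localhost:9200.

```shell
# Fetch the Topbeat index template (URL is an assumption; adjust
# to wherever your Topbeat version ships its template file).
curl -O https://raw.githubusercontent.com/elastic/beats/1.1/topbeat/etc/topbeat.template.json

# Load it into Elasticsearch so incoming Topbeat fields are mapped correctly.
curl -XPUT 'http://localhost:9200/_template/topbeat' -d @topbeat.template.json
```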

First, download the Topbeat index template on your

Continue Reading

Adding Filters to Logstash (ELK stack)


This post collects a couple of configurations I needed for a particular environment. I already have my stack working. There are many other filters, patterns, and configurations; I will add more over time.

Default PATHS

Logstash configuration directory: /etc/logstash/conf.d
Logstash patterns directory: /opt/logstash/patterns

Specific Configuration


Prospector (client side – Filebeat)

This block must go beneath the prospectors section, maintaining the indentation.

      paths:
        - /var/log/auth.log
        - /var/log/syslog
      input_type: log
      document_type: syslog
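
For context, the whole prospectors section of /etc/filebeat/filebeat.yml (Filebeat 1.x) would then look roughly like this sketch:

```yaml
filebeat:
  prospectors:
    -
      paths:
        - /var/log/auth.log
        - /var/log/syslog
      input_type: log
      document_type: syslog
```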

Log example

Jun  3 12:17:01 server01 /USR/SBIN/CRON[15365]: (root) CMD (   cd / && run-parts --report /etc/cron.hourly)


There is no extra grok pattern you need to add for this log format.
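
To sanity-check how the grok expression in 10-syslog-filter.conf will split the example line above, here is a rough Python approximation. It is only a sketch: the real SYSLOGTIMESTAMP, SYSLOGHOST, DATA, POSINT, and GREEDYDATA patterns are more permissive than these regex stand-ins.

```python
import re

# Example syslog line from above (note the double space after "Jun").
LINE = ("Jun  3 12:17:01 server01 /USR/SBIN/CRON[15365]: "
        "(root) CMD (   cd / && run-parts --report /etc/cron.hourly)")

# Rough stand-ins for the grok patterns used in the filter.
SYSLOG_RE = re.compile(
    r"(?P<syslog_timestamp>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}) "
    r"(?P<syslog_hostname>\S+) "
    r"(?P<syslog_program>.*?)(?:\[(?P<syslog_pid>\d+)\])?: "
    r"(?P<syslog_message>.*)"
)

fields = SYSLOG_RE.match(LINE).groupdict()
print(fields["syslog_program"])  # /USR/SBIN/CRON
print(fields["syslog_pid"])      # 15365
```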


This configuration is inside 10-syslog-filter.conf

filter { 
  if [type] == "syslog" {
    grok {
      match => { "message" => "%{SYSLOGTIMESTAMP:syslog_timestamp} %{SYSLOGHOST:syslog_hostname} %{DATA:syslog_program}(?:\[%{POSINT:syslog_pid}\])?: %{GREEDYDATA:syslog_message}" }
      add_field => [ "received_at", "%{@timestamp}" ]
      add_field => [ "received_from", "%{host}" ]
    }
    syslog_pri { }
    date {
      match => [ "syslog_timestamp", 
Continue Reading

Install Elasticsearch, Logstash, and Kibana (ELK Stack) on CentOS 7


In this tutorial, we will go over the installation of the Elasticsearch ELK Stack on CentOS 7—that is, Elasticsearch 2.3.x, Logstash 2.3.x, and Kibana 4.5.x. We will also show you how to configure it to gather and visualize the syslogs of your systems in a centralized location, using Filebeat 1.1.x.

Logstash is an open source tool for collecting, parsing, and storing logs for future use.

Kibana is a web interface that can be used to search and view the logs that Logstash has indexed. Both of these tools are based on Elasticsearch, which is used for storing logs.

Centralized logging can be very useful when attempting to identify problems with your servers or applications, as it allows you to search through all of your logs in a single place. It is also useful because it allows you to identify issues that span multiple servers by correlating their logs during

Continue Reading