[Date Prev][Date Next]   [Thread Prev][Thread Next]   [Thread Index] [Date Index] [Author Index]

Re: Openshift centralized logging - add custom container logfiles


Am 16.08.2018 um 16:27 schrieb Rich Megginson:
> On 08/16/2018 05:42 AM, Aleksandar Lazic wrote:
>> Am 16.08.2018 um 12:48 schrieb Aleksandar Kostadinov:
>>> Might be real nice to allow pod to request sockets created where different log
>>> streams can be sent to central logging without extra containers in the pod.
>> You can run socklog/fluentbit/... in the background to handle the logging and
>> your app logs to this socket.
> So you would need to configure your app to log to a socket instead of a log file?
> Where does socklog write the logs?  Who reads from that destination?

Socklog writes to stdout by default.
In my setup, haproxy is configured to write to the unix socket, but socklog can
also listen on a UDP socket.
In either case the output is written to stdout.
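For illustration, the haproxy side of this could look like the following sketch of a haproxy.cfg fragment (the socket path matches the entrypoint snippet quoted below; the facility is an assumption):

```
# haproxy.cfg (sketch): send syslog messages to the unix socket socklog listens on
global
    log /tmp/haproxy_syslog local0

defaults
    log global
```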


I have described the setup in two blog posts.

Another possible tool is https://fluentbit.io/ as it supports more input sources.

For example, you can use its tail input if it's not easy to change the logging
setup of the app.
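A minimal Fluent Bit configuration for that tail case could look like this sketch (the file path and tag are assumptions, not from my setup):

```
# fluent-bit.conf (sketch): tail a custom logfile and write it to stdout,
# where the cluster logging stack can pick it up
[SERVICE]
    Flush        1

[INPUT]
    Name         tail
    Path         /var/log/myapp/app.log
    Tag          myapp

[OUTPUT]
    Name         stdout
    Match        myapp
```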

In the past, rsyslog was hard to set up on OpenShift with normal (non-root)
privileges from the RHEL image; that was the reason I built this solution, imho.
Also, https://www.rsyslog.com/doc/v8-stable/configuration/modules/omstdout.html
is documented as not intended for real deployments.

Best Regards

>> Something similar as I have done it in my haproxy image.
>> https://gitlab.com/aleks001/haproxy18-centos/blob/master/containerfiles/container-entrypoint.sh#L92-93
>> ###
>> ...
>> echo "starting socklog"
>> /usr/local/bin/socklog unix /tmp/haproxy_syslog &
>> ...
>> ###
>> Regards
>> Aleks
>>> Jeff Cantrill wrote on 08/15/18 16:50:
>>>> The recommended options with the current log stack are either to reconfigure
>>>> your log to send to stdout or add a sidecar container that is capable of
>>>> tailing the log in question which would write it to stdout and ultimately
>>>> read by fluentd.
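[The sidecar approach Jeff describes could be sketched roughly like this; pod name, image, and log path are made-up placeholders:]

```
# Pod sketch: app writes to a file on a shared volume, a sidecar tails it to stdout
apiVersion: v1
kind: Pod
metadata:
  name: myapp
spec:
  containers:
  - name: app
    image: myapp:latest            # hypothetical app writing /var/log/app/app.log
    volumeMounts:
    - name: logs
      mountPath: /var/log/app
  - name: log-tailer               # sidecar: follows the custom logfile
    image: busybox
    args: ['/bin/sh', '-c', 'tail -n +1 -F /var/log/app/app.log']
    volumeMounts:
    - name: logs
      mountPath: /var/log/app
  volumes:
  - name: logs
    emptyDir: {}
```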
>>>> On Wed, Aug 15, 2018 at 2:47 AM, Leo David <leoalex gmail com
>>>> <mailto:leoalex gmail com>> wrote:
>>>>      Hi Everyone,
>>>>      I have logging with fluentd / elasticsearch at cluster level running
>>>>      fine,  everything works as expected.
>>>>      I have an issue though...
>>>>      What would it be the procedure to add some custom log files from
>>>>      different containers ( logs that are not shown in stdout ) to be
>>>>      delivered to elasticseach as well ?
>>>>      I have two different clusters ( 3.7 and 3.9 ) up and running, and I know
>>>>      that in 3.7 the docker logging driver is configured with journald whilst
>>>>      in 3.9 it is json-file.
>>>>      Any thoughts on this ?
>>>>      Thanks a lot !
>>>>      --     Best regards, Leo David
>>>> -- 
>>>> -- 
>>>> Jeff Cantrill
>>>> Senior Software Engineer, Red Hat Engineering
>>>> OpenShift Logging
>>>> Red Hat, Inc.
>>>> *Office*: 703-748-4420 | 866-546-8970 ext. 8162420
>>>> jcantril redhat com <mailto:jcantril redhat com>
>>>> http://www.redhat.com
