Audit logs in Red Hat Developer Hub
Tracking user activities, system events, and data changes with Red Hat Developer Hub audit logs
Abstract
Track user activities, system events, and data changes with audit logs to enhance security, automate compliance, and debug issues.
Audit logs are a chronological set of records documenting the user activities, system events, and data changes that affect your Red Hat Developer Hub users, administrators, or components. Administrators can view Developer Hub audit logs in the OpenShift Container Platform web console to monitor scaffolder events, changes to the RBAC system, and changes to the Catalog database. Audit logs include the following information:
- Name of the audited event
- Actor that triggered the audited event, for example, terminal, port, IP address, or hostname
- Event metadata, for example, date, time
- Event status, for example, `success` or `failure`
- Severity levels, for example, `info`, `debug`, `warn`, or `error`
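Taken together, these fields typically appear as a single JSON record per event. The following record is purely illustrative; the field names and values are hypothetical and are not the exact Developer Hub schema:

```json
{
  "actor": {
    "actorId": "user:default/jdoe",
    "ip": "10.128.2.14",
    "hostname": "rhdh.example.com"
  },
  "eventName": "ScaffolderTaskCreation",
  "isAuditEvent": true,
  "level": "info",
  "status": "succeeded",
  "timestamp": "2024-07-02T14:35:22.000Z"
}
```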
You can use the information in the audit log to achieve the following goals:
- Enhance security
- Trace activities, including those initiated by automated systems and software templates, back to their source. Know when software templates are executed, and the details of application and component installations, updates, configuration changes, and removals.
- Automate compliance
- Use streamlined processes to view log data for specified points in time for auditing purposes or continuous compliance maintenance.
- Debug issues
- Use access records and activity details to fix issues with software templates or plugins.
Audit logs are not forwarded to the internal log store by default because the internal log store does not provide secure storage. You are responsible for ensuring that the system to which you forward audit logs complies with your organizational and governmental regulations and is properly secured.
1. Configure audit logs for Developer Hub on OpenShift Container Platform
Configure logging deployment, log collector, and log forwarding components to enable audit logging for Developer Hub on OpenShift Container Platform.
Prerequisites
- You have access to the OpenShift Container Platform web console.
- You have `cluster-admin` privileges.
Procedure
- Configure the logging deployment, including both the CPU and memory limits for each logging component. For more information, see Red Hat OpenShift Container Platform - Configuring your Logging deployment.
- To configure the logging collector, configure the `spec.collection` stanza in the `ClusterLogging` custom resource (CR) to use a supported modification to the log collector and to collect logs from `STDOUT`. For more information, see Red Hat OpenShift Container Platform - Configuring the logging collector.
- To configure log forwarding, send logs to specific endpoints inside and outside your OpenShift Container Platform cluster by specifying a combination of outputs and pipelines in a `ClusterLogForwarder` CR. For more information, see Red Hat OpenShift Container Platform - Enabling JSON log forwarding and Red Hat OpenShift Container Platform - Configuring log forwarding.
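The first two steps above can be sketched in a single `ClusterLogging` CR that sets the collector type and its resource limits in one place. This is a minimal, illustrative sketch assuming the vector collector; the resource values are placeholders, not recommendations:

```yaml
apiVersion: logging.openshift.io/v1
kind: ClusterLogging
metadata:
  name: instance
  namespace: openshift-logging
spec:
  managementState: Managed
  collection:
    type: vector            # supported log collector
    resources:              # illustrative CPU and memory limits
      requests:
        cpu: 100m
        memory: 736Mi
      limits:
        memory: 736Mi
```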
2. Forward Red Hat Developer Hub audit logs to Splunk
Forward audit logs from Developer Hub to Splunk by using the OpenShift Logging Operator and a ClusterLogForwarder instance.
Prerequisites
- You have a cluster running on a supported OpenShift Container Platform version.
- You have an account with `cluster-admin` privileges.
- You have a Splunk Cloud account or Splunk Enterprise installation.
Procedure
- Log in to your OpenShift Container Platform cluster.
- Install the OpenShift Logging Operator in the `openshift-logging` namespace and switch to the namespace:

      $ oc project openshift-logging
- Create a service account named `log-collector`:

      $ oc create sa log-collector
- Bind the `collect-application-logs` cluster role to the service account:

      $ oc create clusterrolebinding log-collector --clusterrole=collect-application-logs --serviceaccount=openshift-logging:log-collector
- Generate a `hecToken` in your Splunk instance.
- Create a key/value secret in the `openshift-logging` namespace and verify the secret:

      $ oc -n openshift-logging create secret generic splunk-secret --from-literal=hecToken=<HEC_Token>
      $ oc -n openshift-logging get secret/splunk-secret -o yaml
- Create a basic `ClusterLogForwarder` resource YAML file as follows:

      apiVersion: logging.openshift.io/v1
      kind: ClusterLogForwarder
      metadata:
        name: instance
        namespace: openshift-logging
For more information, see Creating a log forwarder.
- Define the following `ClusterLogForwarder` configuration by using the OpenShift web console or OpenShift CLI:
- Specify `log-collector` as the `serviceAccount` in the YAML file:

      serviceAccount:
        name: log-collector
- Configure `inputs` to specify the type and source of logs to forward. The following configuration enables the forwarder to capture logs from all applications in a provided namespace:

      inputs:
        - name: my-app-logs-input
          type: application
          application:
            includes:
              - namespace: my-rhdh-project
            containerLimit:
              maxRecordsPerSecond: 100

  For more information, see Forwarding application logs from specific pods.
- Configure outputs to specify where to send the captured logs. In this step, focus on the `splunk` type. If the Splunk endpoint uses self-signed TLS certificates, you can either set the `tls.insecureSkipVerify` option (not recommended) or provide the certificate chain by using a secret.

      outputs:
        - name: splunk-receiver-application
          type: splunk
          splunk:
            authentication:
              token:
                key: hecToken
                secretName: splunk-secret
            index: main
            url: 'https://my-splunk-instance-link'
          rateLimit:
            maxRecordsPerSecond: 250

  For more information, see Forwarding logs to Splunk in OpenShift Container Platform documentation.
- Optional: Filter logs to include only audit logs:

      filters:
        - name: audit-logs-only
          type: drop
          drop:
            - test:
                - field: .message
                  notMatches: isAuditEvent

  For more information, see Filtering logs by content in OpenShift Container Platform documentation.
- Configure pipelines to route logs from specific inputs to designated outputs. Use the names of the defined inputs and outputs to specify multiple `inputRefs` and `outputRefs` in each pipeline, as in the following example `pipelines` configuration:

      pipelines:
        - name: my-app-logs-pipeline
          detectMultilineErrors: true
          inputRefs:
            - my-app-logs-input
          outputRefs:
            - splunk-receiver-application
          filterRefs:
            - audit-logs-only
- Run the following command to apply the `ClusterLogForwarder` configuration:

      $ oc apply -f <ClusterLogForwarder-configuration.yaml>
- Optional: To reduce the risk of log loss, configure your `ClusterLogForwarder` pods by using the following options:
- Define the resource requests and limits for the log collector, as in the following example `collector` configuration:

      collector:
        resources:
          requests:
            cpu: 250m
            memory: 64Mi
            ephemeral-storage: 250Mi
          limits:
            cpu: 500m
            memory: 128Mi
            ephemeral-storage: 500Mi

- Define `tuning` options for log delivery, including `delivery`, `compression`, and the retry durations. You can apply tuning per output as needed, as in the following example `tuning` configuration:

      tuning:
        delivery: AtLeastOnce
        compression: none
        minRetryDuration: 1s
        maxRetryDuration: 10s
- `AtLeastOnce`: The `AtLeastOnce` delivery mode ensures that if the log forwarder crashes or restarts, it re-sends any logs that were read but not yet delivered to their destination. The forwarder might duplicate some logs after a crash.
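Note the double negation in the audit-log filter earlier in this procedure: a `drop` filter whose condition is `notMatches: isAuditEvent` drops every record whose `.message` does not contain `isAuditEvent`, so only audit events remain. Conceptually, it behaves like the following `grep`, shown here with two made-up log lines purely as a local illustration, not as part of the procedure:

```shell
# Two sample records: an audit event and an ordinary application log (both made up).
printf '%s\n' \
  '{"isAuditEvent":true,"message":"isAuditEvent actor=user:default/jdoe"}' \
  '{"level":"info","message":"plugin catalog started"}' \
| grep 'isAuditEvent'
# Only the first record survives, mirroring the drop/notMatches semantics.
```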
Verification
- Verify that your Splunk instance receives logs by viewing them in the Splunk dashboard.
- Troubleshoot any issues using OpenShift Container Platform and Splunk logs as needed.
3. View audit logs in Developer Hub
You can view, search, filter, and manage audit log data directly from the Red Hat OpenShift Container Platform web console. To isolate these logs from other data types, filter your results by using the `isAuditEvent` field.
Prerequisites
- You are logged in as an administrator in the OpenShift Container Platform web console.
Procedure
- From the Developer perspective of the OpenShift Container Platform web console, click the Topology tab.
- From the Topology view, click the pod that you want to view audit log data for.
- From the pod panel, click the Resources tab.
- From the Pods section of the Resources tab, click View logs.
- From the Logs view, enter `isAuditEvent` into the Search field to filter audit logs from other log types. You can use the arrows to browse the logs that contain the `isAuditEvent` field.
Additional resources