Executed CLI commands and NGINX logs to Chronicle

Hi,

I want to ingest 2 different types of logs from GCP to Chronicle SIEM.

1) executed commands on GCP projects from CLI and

2) Nginx logs which shows who reached which project bucket.

At the moment I ingest GCP logs via native ingestion, but the executed commands aren't being ingested.

How can I collect these logs?  Thanks for your helpful responses.

Ismail Kaya 


So the CLI commands should exist in the Audit Logs, which can be natively ingested. If you're looking for a specific logged command, such as `gcloud projects list`, then you'll have to understand what the logs show you. Meaning, the Audit Log will show you an event and activity regardless of whether it was performed via the CLI or via the GUI. The log is the same because the GUI is just leveraging the API.

{
  "protoPayload": {
    "@type": "type.googleapis.com/google.cloud.audit.AuditLog",
    "authenticationInfo": {
      "principalEmail": "email@domain.com"
    },
    "requestMetadata": {
      "callerIp": "REDACTED.RE.RE.RE",
      "requestAttributes": {},
      "destinationAttributes": {}
    },
    "serviceName": "admin.googleapis.com",
    "methodName": "google.admin.AdminService.changePassword",
    "resourceName": "organizations/123412341234/userSettings",
    "metadata": {
      "activityId": {
        "uniqQualifier": "-123123123123123123",
        "timeUsec": "1700680312053000"
      },
      "event": [
        {
          "eventId": "32123212321",
          "parameter": [
            {
              "name": "USER_EMAIL",
              "type": "TYPE_STRING",
              "label": "LABEL_OPTIONAL",
              "value": "USER@domain.com"
            }
          ],
          "eventName": "CHANGE_PASSWORD",
          "eventType": "USER_SETTINGS"
        }
      ],
      "@type": "type.googleapis.com/ccc_hosted_reporting.ActivityProto"
    }
  },
  "insertId": "123asd123asd",
  "resource": {
    "type": "audited_resource",
    "labels": {
      "method": "google.admin.AdminService.changePassword",
      "service": "admin.googleapis.com"
    }
  },
  "timestamp": "2023-11-22T19:11:52.053Z",
  "severity": "NOTICE",
  "logName": "organizations/123321123/logs/cloudaudit.googleapis.com%2Factivity",
  "receiveTimestamp": "2023-11-22T19:11:52.491646372Z"
}
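
If you specifically want to pick out the entries that came from the CLI, one heuristic (just a sketch, not an exact recipe: YOUR_PROJECT_ID is a placeholder, and it relies on gcloud typically identifying itself in the caller-supplied user agent) is to filter the Admin Activity audit logs like this:

gcloud logging read 'logName:"cloudaudit.googleapis.com%2Factivity" AND protoPayload.requestMetadata.callerSuppliedUserAgent:"gcloud"' --project=YOUR_PROJECT_ID --limit=10 --format=json

Requests made from the Cloud Console generally won't carry that user agent, so this is a rough way to separate CLI-driven activity from GUI-driven activity.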

 

I'm not sure where your NGINX logs originate from. Is this an NGINX VM running in GKE/GCE which logs to syslog and exports to Cloud Logging? If so, you can get those to ship to Chronicle natively.

See here for some more information https://cloud.google.com/chronicle/docs/ingestion/cloud/ingest-gcp-logs
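
If you want to sanity-check that the VM's NGINX entries are actually landing in Cloud Logging before wiring up the Chronicle side, something like this should show them (a rough sketch; the nginx_access / nginx_error log IDs assume the Ops Agent's built-in NGINX receivers discussed later in this thread, and YOUR_PROJECT_ID is a placeholder):

gcloud logging read 'resource.type="gce_instance" AND (log_id("nginx_access") OR log_id("nginx_error"))' --project=YOUR_PROJECT_ID --limit=5 --format=json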


@minkaya wrote:

1) executed commands on GCP projects from CLI and


Cloud Shell by default isn't a VM that runs in your org, and as a result the CLI's own audit logging does not get written back to your Operations (Cloud Logging) logs, but the action a command performs would still appear in GCP Audit Logging.

I believe there is a feature where you can either disable Cloud Shell (due to this lack of auditing), or use your own Cloud Shell image (where you could add auditing).


@minkaya wrote:

2) Nginx logs which shows who reached which project bucket.


NGINX logs in GCP would be a custom channel and are not ingested via the native GCP-to-Chronicle ingestion. You'd need to get the GCP Ops Agent onto your GCE or GKE instances and write the logging to GCP Operations (Cloud Logging), then use a custom sink to GCS or Pub/Sub (going into public preview soon), and use Feed Management to collect these logs.
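
As a rough sketch of that pipeline (the bucket name and project ID below are placeholders, and the receiver types assume the Ops Agent's built-in NGINX integration), you'd first tell the Ops Agent to collect the NGINX logs, e.g. in /etc/google-cloud-ops-agent/config.yaml:

logging:
  receivers:
    nginx_access:
      type: nginx_access
    nginx_error:
      type: nginx_error
  service:
    pipelines:
      nginx:
        receivers: [nginx_access, nginx_error]

Then create a sink that routes just those logs to a GCS bucket (or a Pub/Sub topic) that a Chronicle feed can read from:

gcloud logging sinks create chronicle-nginx-sink storage.googleapis.com/YOUR_BUCKET_NAME --log-filter='log_id("nginx_access") OR log_id("nginx_error")' --project=YOUR_PROJECT_ID

Remember that the sink's writer identity needs write access to the destination bucket, and the Chronicle feed then points at that bucket via Feed Management.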

Hi @cmmartin_google

I'm trying to accomplish this task too. 

I configured the Ops Agent to send Compute Engine syslog and NGINX logs to Cloud Logging, but Chronicle didn't accept the export filter configuration for the NGINX logs: log_id("nginx_access") and log_id("nginx_error").

Why is it necessary to use a forwarder if all the NGINX logs are already in Cloud Logging?

If I use Cloud Storage, is there any delay in sending these NGINX logs to Chronicle?