12/14/2023

This destination supports using Akamaized hostnames as endpoints to send DataStream 2 logs for improved security. When you create a property with an Elasticsearch endpoint URL as the hostname, this property acts as a proxy between DataStream and the destination. As a result, you can filter incoming traffic to your destination endpoint by IP address using the Origin IP Access List behavior: only IP addresses that belong to your Akamaized property hostname can send logs to your custom destination. Using Akamaized hostnames as endpoints also requires enabling the Allow POST behavior in your property.

Once the property hostname works as a destination endpoint, you cannot monitor it as a property in this or another stream. Likewise, if you already monitor a property in DataStream, you cannot use it as a destination endpoint.

To set up an Akamaized hostname:

1. Go to ☰ > CDN > Properties, or just enter Properties in the search box.
2. Go to Property Manager and create a new property. We recommend choosing API Acceleration as the product.
3. Set your Elasticsearch endpoint URL as the property hostname.
4. Click the Property Name link to go to the property you created.
5. Activate the property on the production network. Only properties active on the production network can serve as DataStream destinations.

Optionally, go to Custom header and enter the Custom header name and Custom header value. The custom header name can contain alphanumeric, dash, and underscore characters. DataStream 2 does not support custom header user values containing:

- Akamai (allowed if using an Akamaized hostname as destination)

If you want to use mutual authentication, provide both the client certificate and the client key.
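The custom header naming rule above (alphanumeric, dash, and underscore characters only) can be expressed as a quick local check. This is an illustrative sketch for validating a header name before configuring the stream, not part of DataStream itself:

```python
import re

# Custom header names may contain only alphanumeric, dash,
# and underscore characters (per the rule above).
_HEADER_NAME_RE = re.compile(r"^[A-Za-z0-9_-]+$")

def is_valid_custom_header_name(name: str) -> bool:
    """Return True if `name` satisfies the character rule."""
    return bool(_HEADER_NAME_RE.fullmatch(name))

print(is_valid_custom_header_name("X-Log-Source"))  # True: letters and dashes
print(is_valid_custom_header_name("trace id"))      # False: space not allowed
```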
To configure the Elasticsearch destination in DataStream:

1. In Display name, enter a human-readable name for the destination.
2. In Endpoint, enter the Elasticsearch bulk endpoint URL you want to send your logs to. The endpoint URL should follow the required format. See Connect to Elasticsearch if you want to retrieve your Elasticsearch endpoint URL.
3. In Index name, enter the name of the index inside your Elasticsearch cluster where you want to send logs.
4. In Username, enter the Elasticsearch username you pass in the Authorization header of your requests; it grants access to the space protected by basic access authentication.
5. In Password, enter the Elasticsearch password you pass in the Authorization header of your requests.
6. If you want to send compressed gzip files to Elasticsearch, check the Send compressed data box.
7. Optionally, change the Push frequency to receive bundled logs at your destination every 30, 60, or 90 seconds.
8. Optionally, click Additional options to add mTLS certificates for additional authentication:
   - TLS hostname: the hostname matching the Subject Alternative Names (SANs) present in the SSL certificate for the endpoint URL. If not provided, DataStream 2 fetches the hostname from the URL.
   - CA certificate: the CA certificate, in PEM format, that you want to use to verify the origin server's certificate. DataStream requires a CA certificate if you provide a self-signed certificate or a certificate signed by an unknown authority.
   - Client certificate: the client certificate, in PEM format, that you want to use to authenticate requests to your destination.
   - Client key: the client key you want to use to authenticate to the backend server, in PEM (non-encrypted PKCS8) format.
9. Click Validate & Save to validate the connection to the destination and save the details you provided.
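The destination settings above map onto an ordinary HTTP request against the Elasticsearch bulk API: the username and password become a basic-auth Authorization header, the index name selects where documents land, and the compression checkbox corresponds to a gzip-encoded body. The sketch below illustrates that mapping; the endpoint URL, credentials, and log fields are hypothetical examples, and this is not DataStream's actual implementation:

```python
import base64
import gzip
import json

def build_bulk_request(endpoint, index, username, password, records, compress=True):
    """Build the headers and body of an Elasticsearch _bulk POST,
    roughly mirroring the destination settings above."""
    # Basic access authentication: the Username/Password fields end up
    # in an Authorization header like this one.
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    headers = {
        "Authorization": f"Basic {token}",
        "Content-Type": "application/x-ndjson",
    }
    # The _bulk API expects newline-delimited JSON: an action line
    # naming the target index, then the document itself, per record.
    lines = []
    for rec in records:
        lines.append(json.dumps({"index": {"_index": index}}))
        lines.append(json.dumps(rec))
    body = ("\n".join(lines) + "\n").encode()
    if compress:  # corresponds to the "Send compressed data" checkbox
        headers["Content-Encoding"] = "gzip"
        body = gzip.compress(body)
    return endpoint, headers, body

url, headers, body = build_bulk_request(
    "https://elastic.example.com:9200/_bulk",  # hypothetical endpoint
    "datastream-logs", "elastic", "s3cret",    # hypothetical index and credentials
    [{"statusCode": 200, "reqPath": "/index.html"}],
)
```

Sending the resulting request with any HTTP client would deliver one log record to the `datastream-logs` index; DataStream performs the equivalent exchange on your behalf at the configured push frequency.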