Custom log ingestion (tables, DCR destination etc) #1241
Open
Ryan-Palmer wants to merge 22 commits into CompositionalIT:master from
Conversation
…ated the existing model to support Log Analytics.
… empty object if not specified, otherwise the DCR Visualizer doesn't work.
This PR closes #1171
I've reverse-engineered the setup needed for log ingestion by setting up Serilog through the portal and inspecting the ARM.
See the full SAFE stack demo example with raw JSON resources here:
https://github.com/Ryan-Palmer/Safe-Serilog-Farmer-Demo
The changes in this PR are as follows:
- Appended `_CL` to the table names (as the portal does).
- Added a `Column` type used in the Table and DCR config.
- Added `TransformKQL` and `OutputStream` properties to the DataFlow config.
- Added `StreamDeclarations` to the DCR config:
  > Stream declarations: Declaration of the different types of data sent into the Log Analytics workspace. Each stream is an object whose key represents the stream name, which must begin with Custom-
- Updated the `CustomStream` printer to prepend `Custom-` (as the portal does) and append `_CL` (as the portal requires). This allows passing of the plain table name (see the naming sketch after this list). Could make a new case `CustomStreamAutoFormat` or something if this is likely to be an issue? Or just rely on people finding some obscure docs or experimenting in the portal to figure it out?
- Made `OsType` optional, as setting it to any value causes the DCR to fail in the logging scenario (not sure about others?).
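To make the naming behaviour concrete, here is a minimal standalone sketch of the formatting described above (illustrative only, not the actual `CustomStream` printer code):

```fsharp
// Illustrative sketch of the naming rules described above, not the actual
// Farmer printer implementation.
// The user supplies a plain table name; the stream name must begin with
// "Custom-" and the custom table name must end with "_CL".
let customStreamName (tableName: string) = $"Custom-{tableName}_CL"
let customTableName (tableName: string) = $"{tableName}_CL"

// e.g.
// customStreamName "ApplicationLogs" = "Custom-ApplicationLogs_CL"
// customTableName  "ApplicationLogs" = "ApplicationLogs_CL"
```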
I have read the contributing guidelines and have completed the following:

If I haven't completed any of the tasks above, I include the reasons why here:
Below is a minimal example configuration that includes the new features, which can be used to deploy to Azure:
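A rough sketch of the shape such a configuration might take follows; the DCR-related builder keywords (`dataCollectionRule`, `add_destination`, `add_stream_declaration`, `transform_kql`, `output_stream`) and the `Column` cases are illustrative assumptions here and may differ from the actual API added in this PR:

```fsharp
// Hypothetical sketch only: the DCR-related keywords below are illustrative
// and may not match the actual Farmer API introduced by this PR. The
// logAnalytics and arm builders are existing Farmer constructs.
open Farmer
open Farmer.Builders

// Log Analytics workspace that will receive the custom logs.
let workspace = logAnalytics {
    name "my-workspace"
}

// Data collection rule declaring a custom stream, transforming it with KQL
// and routing it to a custom table in the workspace.
let dcr = dataCollectionRule {                       // assumed builder name
    name "my-dcr"
    add_destination workspace                        // assumed keyword
    add_stream_declaration "ApplicationLogs" [       // assumed keyword; plain name is
        "TimeGenerated", Column.DateTime             // emitted as Custom-ApplicationLogs_CL
        "Message", Column.String                     // Column cases assumed for illustration
    ]
    transform_kql "source"                           // assumed keyword
    output_stream "ApplicationLogs"                  // assumed keyword
}

let deployment = arm {
    location Location.WestEurope
    add_resource workspace
    add_resource dcr
}
```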