Getting data from Azure File Share
It seems silly that Power BI Desktop can access files from an Azure Blob store but not an Azure File Share. Comments?
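There is no native File Share connector, but one workaround that is sometimes used is to mount the share over SMB and read it like a local folder; a minimal M sketch, assuming the share has been mapped to a hypothetical Z: drive:
let
    // Workaround, not a native connector: the Azure File Share is assumed to be
    // mounted over SMB as the hypothetical drive Z:, so it reads like any folder
    Source = Folder.Files("Z:\Reports"),
    // Take the first file's binary content and open it as an Excel workbook
    FirstFile = Source{0}[Content],
    Workbook = Excel.Workbook(FirstFile, null, true)
in
    Workbook
A refresh of a source like this would then go through a gateway installed on the machine where the drive is mapped.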
Can't expand a custom field on JIRA
Hello everyone.
What I'm trying to do is extract a custom value created on a Jira project.
I downloaded the pbit, changed "= FetchPages("", 500)" to "= FetchPages("", 100)", inserted the URL and, following a post,
changed the fourth query, "Expanded Fields", from
= Table.ExpandRecordColumn(#"Expanded Column1", "fields",
{"issuetype", "timespent", "project", "fixVersions", "customfield_10110", "customfield_10111",
"aggregatetimespent", "resolution", "customfield_10112", "customfield_10113", "customfield_10114",
"customfield_10104", "customfield_10105", "customfield_10106", "customfield_10107",
"customfield_10108", "customfield_10109", "resolutiondate", "workratio", "lastViewed",
"watches", "created", "priority", "customfield_10100", "customfield_10101", "customfield_10102",
"customfield_10103", "labels", "timeestimate", "aggregatetimeoriginalestimate", "versions",
"issuelinks", "assignee", "updated", "status", "components", "timeoriginalestimate",
"description", "customfield_10006", "customfield_10009", "aggregatetimeestimate", "summary",
"creator", "subtasks", "reporter", "customfield_10000", "aggregateprogress", "customfield_10001",
"customfield_10004", "customfield_10115", "customfield_10116", "customfield_10117",
"environment", "customfield_10118", "customfield_10119", "duedate", "progress", "votes",
"parent", "customfield_10005", "customfield_10007", "customfield_10008", "customfield_10002",
"customfield_10003"},
{"issuetype", "timespent", "project", "fixVersions", "customfield_10110", "customfield_10111",
"aggregatetimespent", "resolution", "customfield_10112", "customfield_10113", "customfield_10114",
"customfield_10104", "customfield_10105", "customfield_10106", "customfield_10107",
"customfield_10108", "customfield_10109", "resolutiondate", "workratio", "lastViewed",
"watches", "created", "priority", "customfield_10100", "customfield_10101", "customfield_10102",
"customfield_10103", "labels", "timeestimate", "aggregatetimeoriginalestimate", "versions",
"issuelinks", "assignee", "updated", "status", "components", "timeoriginalestimate",
"description", "customfield_10006", "customfield_10009", "aggregatetimeestimate", "summary",
"creator", "subtasks", "reporter", "customfield_10000", "aggregateprogress", "customfield_10001",
"customfield_10004", "customfield_10115", "customfield_10116", "customfield_10117", "environment",
"customfield_10118", "customfield_10119", "duedate", "progress", "votes", "parent",
"customfield_10005", "customfield_10007", "customfield_10008", "customfield_10002",
"customfield_10003"})
to
= Table.ExpandRecordColumn(#"Expanded Column1", "fields",
{"issuetype", "timespent", "project", "fixVersions", "customfield_10110", "customfield_10111",
"aggregatetimespent", "resolution", "customfield_10112", "customfield_10113", "customfield_10114",
"customfield_10104", "customfield_10105", "customfield_10106", "customfield_10107",
"customfield_10108", "customfield_10109", "resolutiondate", "workratio", "lastViewed", "watches",
"created", "priority", "customfield_10100", "customfield_10101", "customfield_10102",
"customfield_10103", "labels", "timeestimate", "aggregatetimeoriginalestimate", "versions",
"issuelinks", "assignee", "updated", "status", "components", "timeoriginalestimate", "description",
"customfield_10006", "customfield_10009", "aggregatetimeestimate", "summary", "creator", "subtasks",
"reporter", "customfield_10000", "aggregateprogress", "customfield_10001", "customfield_10004",
"customfield_10115", "customfield_10116", "customfield_10117", "environment", "customfield_10118",
"customfield_10119", "duedate", "progress", "votes", "parent", "customfield_10005",
"customfield_10007", "customfield_10008", "customfield_10002", "customfield_10003",
"customfield_12021"},
{"issuetype", "timespent", "project", "fixVersions", "customfield_10110",
"customfield_10111", "aggregatetimespent", "resolution", "customfield_10112", "customfield_10113",
"customfield_10114", "customfield_10104", "customfield_10105", "customfield_10106",
"customfield_10107", "customfield_10108", "customfield_10109", "resolutiondate", "workratio",
"lastViewed", "watches", "created", "priority", "customfield_10100", "customfield_10101",
"customfield_10102", "customfield_10103", "labels", "timeestimate", "aggregatetimeoriginalestimate",
"versions", "issuelinks", "assignee", "updated", "status", "components", "timeoriginalestimate",
"description", "customfield_10006", "customfield_10009", "aggregatetimeestimate", "summary",
"creator", "subtasks", "reporter", "customfield_10000", "aggregateprogress", "customfield_10001",
"customfield_10004", "customfield_10115", "customfield_10116", "customfield_10117", "environment",
"customfield_10118", "customfield_10119", "duedate", "progress", "votes", "parent",
"customfield_10005", "customfield_10007", "customfield_10008", "customfield_10002",
"customfield_10003", "FieldToExtract"})
adding "customfield_12021" and "FieldToExtract" at the end, respectively.
I know for sure that some values of this field is different from "Blank" and "[Record]", but all I get to see, once ran the query, are these two values.
I thought that "[Record]" is a table that has to be expanded (it is so for other columns), but in the query editor that column cannot be expanded (and I can't find a [Record Value] either).
How do I extract that value?
What am I doing wrong?
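For what it's worth, the usual way to get at such a value is to expand the custom-field record one more time; a minimal, self-contained sketch with made-up data, assuming the Jira field keeps its content under a "value" key (the key name depends on the custom field type, so check one non-blank record first):
let
    // One-row sample table whose custom-field column holds a record,
    // like the [Record] cells described above (data is made up)
    Sample = #table(
        {"key", "FieldToExtract"},
        {{"PRJ-1", [value = "Option A", id = "10201"]}}),
    // Pull the assumed "value" key out of the record column
    Expanded = Table.ExpandRecordColumn(
        Sample, "FieldToExtract", {"value"}, {"FieldToExtract.value"})
in
    Expanded
In the real pbit this extra Table.ExpandRecordColumn step would go after the "Expanded Fields" step, targeting the "FieldToExtract" column.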
How the Jira pbit works without credentials
Hello everyone,
I have a question about how the pbit provided at the link Jira pbit model manages to download all the information about the issues without having to enter user credentials.
The project isn't public, so how does it work?
I don't remember ever setting credentials. Does it use a saved Chrome password or something similar?
Executing SQL procs
Hello,
Is it not possible to EXEC SQL Server (SQL 2014) stored procs from Power BI? I've been reading various posts on different forums and people are saying that I need to use OPENQUERY for this; is that correct? I basically have 7 stored procs that all return tables that I want to display independently on a single dashboard. I want the stored procs to be called every time the dashboard refreshes, so DirectQuery seems to make sense rather than Import.
What I mean by "independently" is that they should not have dependencies on each other. So a slight follow-up question: how do I stop dependencies, would it just be some form of panel?
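For reference, a single proc can be pulled in as its own query using a native query in Import mode; a minimal sketch with hypothetical server, database and procedure names (DirectQuery generally expects tables or views rather than EXEC calls, which is where the OPENQUERY suggestions come from):
let
    // Import-mode native query; "MyServer", "MyDatabase" and the proc name
    // are hypothetical placeholders
    Source = Sql.Database("MyServer", "MyDatabase",
        [Query = "EXEC dbo.usp_GetDashboardTable1"])
in
    Source
Repeating this as seven separate queries keeps the result sets independent of each other.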
Thanks
Get data from Dynamics NAV into Power BI - Can't Connect
Good Morning!!
The process I do is:
- Open Power BI on the web
- Get app
- Dynamics Nav
- URL: https://recuperation.cloudapp.net:7048/NAV/OData/Company('Recuperation%20Electrolitos') (I guess it's correct)
- Auth - Basic
- I Put Username and password
I get the message: The data source credentials could not be updated: OData: Request failed: Unable to connect to the remote server
Could you help on this? Thanks!
Linking Table Error
I have 2 files I need to link by 2 columns. I uploaded links to SQL tables, but everything needs to be linked by Site and Supplier ID.
It won't let me do it. Site (thousands of lines) and Supplier ID (thousands of lines), so many-to-many, but that isn't an option. The Site / Supplier ID combination is unique. Is there a way to do this? (For example, the same Supplier ID could be used for a different supplier at another site.) So I have to be able to link them together for reporting.
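One approach worth mentioning is a composite key; a minimal sketch with made-up table and column names, adding the same Site|Supplier ID key column to each table so the relationship can be created on a single, unique column instead of many-to-many:
let
    // Made-up sample data standing in for one of the two tables
    Suppliers = #table(
        {"Site", "Supplier ID", "Supplier Name"},
        {{"A01", "S100", "Acme"}, {"B02", "S100", "Brown Ltd"}}),
    // Composite key: only the Site / Supplier ID combination is unique
    WithKey = Table.AddColumn(
        Suppliers, "SiteSupplierKey",
        each [Site] & "|" & [#"Supplier ID"],
        type text)
in
    WithKey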
Connecting to Microsoft Teams using Power BI
Hi folks,
Has anyone tried connecting to Microsoft Teams using Power BI to monitor message and phone call data? We currently monitor this with Skype for Business; however, with our recent move to Teams, it appears this is no longer possible unless we do an Excel export.
Any ideas/tips would be greatly appreciated.
Thanks,
J
Schedule Refresh of Excel files in folder
Hi,
I am trying to build a model that should include data from Excel files in a folder. All files are structured the same but have data for different years. I have installed the Data Gateway and added the folder as a source, but when I try to schedule refresh on the model I get this message: "You can't schedule refresh for this dataset because one or more sources currently don't support refresh".
I have used a technique that I have previously used for merging data from multiple SQL databases into a single table.
This is my code:
let
    Source = Folder.Files("NameOfFolder"),
    MergeFolderFile = Table.AddColumn(Source, "Files", each [Folder Path] & [Name]),
    FilesToLoad = Table.Column(MergeFolderFile, "Files"),
    FilesLoop = (FilesToLoad as text) =>
        let
            Source = Excel.Workbook(File.Contents(FilesToLoad), null, true),
            Sheet = Source{[Item="NameOfSheet", Kind="Sheet"]}[Data],
            PromotedHeaders = Table.PromoteHeaders(Sheet, [PromoteAllScalars=true])
        in
            PromotedHeaders,
    LoadFiles = List.Transform(FilesToLoad, each FilesLoop(_)),
    CombineFiles = Table.Combine(LoadFiles)
in
    CombineFiles
Is it possible at all to schedule refresh of files in a folder? If it isn't, it doesn't make sense that the gateway allows me to add a folder as a source.
SharePoint - On-premises Gateway problem :( -> 0x80004005
Hi!
I have a SharePoint On-premises Gateway installed. The test connection is working,
but creating the data source is not!
My logs from the gateway:
DM.EnterpriseGateway Information: 2018-01-19T14:00:05.6614009Z [DataMovement.PipeLine.MashupCommon] Setting mashup connection test connection properties
DM.EnterpriseGateway Information: 2018-01-19T14:00:29.9732736Z [DM.Pipeline.Common] Pool cleaner connections removed: 0, count: 0, buckets: 0 (logged twice)
DM.EnterpriseGateway Error: 2018-01-19T14:00:52.5976987Z [DM.Pipeline.Diagnostics] Exception object created [IsBenign=True]: Microsoft.PowerBI.DataMovement.Pipeline.Diagnostics.MashupDataAccessValueException: Mashup expression evaluation error. Reason: .; ErrorShortName: MashupDataAccessValueException[ErrorCode=-2147467259,HResult=-2147467259]/Wrapped(MashupValueException)[ErrorCode=-2147467259,HResult=-2147467259]
DM.EnterpriseGateway Error: Exception data: DM_ErrorDetailNameCode_UnderlyingErrorCode = -2147467259
DM.EnterpriseGateway Error: Exception data: DM_ErrorDetailNameCode_UnderlyingErrorMessage = SharePoint: Request failed: Unable to connect to the remote server.
DM.EnterpriseGateway Error: Exception data: DM_ErrorDetailNameCode_UnderlyingHResult = -2147467259
DM.EnterpriseGateway Error: Exception data: Microsoft.Data.Mashup.ValueError.DataSourceKind = SharePoint
DM.EnterpriseGateway Error: Exception data: Microsoft.Data.Mashup.ValueError.DataSourcePath = https://intra.REMOVED.com/
DM.EnterpriseGateway Error: Exception data: Microsoft.Data.Mashup.ValueError.Reason = DataSource.Error
DM.EnterpriseGateway Error: [DM.Pipeline.Common.TracingTelemetryService] Event: FireActivityCompletedWithFailureEvent (duration=47148, err=MashupDataAccessValueException, rootcauseErrorEventId=0) (two further failure events followed, duration=47187 and 47247)
DM.EnterpriseGateway Error: 2018-01-19T14:00:52.6289770Z [DM.GatewayCore] Error processing request: Microsoft.PowerBI.DataMovement.Pipeline.Diagnostics.MashupDataAccessValueException: Mashup expression evaluation error. GatewayPipelineErrorCode=DM_GWPipeline_Gateway_MashupDataAccessError ---> GatewayPipelineWrapperException: Substituted: MashupValueException: Microsoft.Data.Mashup.MashupValueException (0x80004005): SharePoint: Request failed: Unable to connect to the remote server.
  at Microsoft.Data.Mashup.ProviderCommon.MashupResource.StartEvaluationAndGetResultSource[T](Int32 timeout)
  at Microsoft.Data.Mashup.MashupCommand.EvaluateAndGetSource[T](String commandText, CommandType commandType, Int32 commandTimeout, MashupParameterCollection parameters, String resultTransform, Boolean forColumnInfo, Boolean executeAction)
  at Microsoft.Data.Mashup.MashupCommand.ExecuteReader(CommandBehavior commandBehavior, MashupCommandBehavior mashupCommandBehavior)
  at Microsoft.Data.Mashup.MashupCommand.ExecuteReader()
  at Microsoft.Data.Mashup.DataSourceReference.TestConnection(String connectionString)
  at Microsoft.PowerBI.DataMovement.Pipeline.MashupCommon.MashupUtils.DSRTestConnectionAsync(String dsrJson, MashupCredential credential)
  at Microsoft.PowerBI.DataMovement.Pipeline.GatewayDataAccess.MashupOleDbConnectionProvider.<TestConnectionAsync>d__3.MoveNext()
GatewayPipelineErrorCode=DM_GWPipeline_UnknownError, InnerType=MashupValueException, InnerMessage=SharePoint: Request failed: Unable to connect to the remote server.
[... the remaining entries repeat the same stack trace through the gateway's async TestDataSourceConnection / EncryptCredentialsWithTestDataSourceConnection pipeline, followed by Verbose FireActivity telemetry events ...]
Beginner: Gateway set up and server questions
Hello. I have installed the Gateway software and am struggling. Unfortunately, I am the IT dept in my company. Can you please tell me if I am understanding the basic entries correctly:
a) Data Source Name: I entered a previously created Power BI document I created
b) Data Source Type: SQL (the source is an online Swiftpage ACT! Web Premium, but using a locally created remote database that synchronizes with it)
c) Server: My computer name
d) Database: The remote database that is on my desktop (as mentioned above, it synchronizes with the database in the web)
e) Basic: I don't understand what the login for this would be (Microsoft? Not Windows? My admin account for the computer? The database?) -- I believe I've tried them all.
f) Windows: I've tried this authentication as well and receive the same message.
I had previously been connected but then tried a refresh and lost my connections to the data, so I re-started. I want to launch the data for the company (tomorrow, of course), so I am trying to make a secure connection, but I'm concerned that if I continue to do it incorrectly, I'll keep wasting time.
Tons of appreciation in advance!
Lisa
Power BI and BIM (IFC Standard)
Hello,
I'm trying to understand whether there is any connection between Power BI and the IFC Standard.
http://www.buildingsmart-tech.org/specifications/ifc-overview/ifc-overview-summary
That is, can someone please let me know if Power BI can connect to data from any of the following file types?
- .ifc
- .ifcXML
- .ifcZIP
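There is no dedicated IFC connector that I know of, but since .ifcXML is plain XML it can at least be opened with the generic XML functions; a minimal sketch with a hypothetical file path (plain .ifc is STEP text and .ifcZIP is an archive, so both would need extra handling):
let
    // Hypothetical path; .ifcXML is ordinary XML, so Xml.Tables can read it
    Source = Xml.Tables(File.Contents("C:\Models\building.ifcxml"))
in
    Source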
JSON data from URL with environment information in the request
Hi All,
I have created an HTTP JSON streamer that runs on Active Directory authentication. It delivers personalised JSON data to Power BI Desktop (as a first step).
I'm looking for a way to include in the request (via query-string parameters or headers) some workbook variables from wherever the stream is consumed.
For example: a user has received an official pbix file, grabs the source URL and uses it in another personal pbix file.
I'm looking for a way to get the workbook/file reference included in the request.
Could I use some environment variables to link to the filename above in this let request?
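A minimal sketch of the kind of request this would take, with hypothetical names: whatever value holds the workbook/file reference (for example a query parameter) can be sent as a query-string parameter and/or a custom header through the Web.Contents options:
let
    // Hypothetical value holding the workbook/file reference
    FileReference = "personal-sales.pbix",
    Source = Json.Document(
        Web.Contents(
            "https://streamer.example.com/data",
            [
                // Sent as ?workbook=... on the query string
                Query = [workbook = FileReference],
                // And/or as a custom request header
                Headers = [#"X-Workbook-Ref" = FileReference]
            ]))
in
    Source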
Thanks in advance!
No Authorisation Header on REST API after Publish
Hi all,
I have designed a dashboard in Power BI Desktop using our own data. The data is accessed using a REST API that is authorised with a token in the header. Everything works fine, but when I publish to the Power BI service and want to schedule the refresh, it shows a 400 error. It seems that the header is lost during publish?
I searched the forums for hours but couldn't find any real solution. The only thing I found was this topic (https://community.powerbi.com/t5/Service/web-contents-with-specified-headers-works-in-PBI-desktop-but/td-p/22786/page/2), but it has no solution either. Can someone help me fix this final piece?
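The workaround most often suggested for this is to keep the token in the Headers option and the variable part of the URL in RelativePath, so the base URL stays static for the service to validate; a minimal sketch with a hypothetical URL and token (whether it clears this particular 400 depends on the API):
let
    // Hypothetical token; in practice this could be a parameter
    Token = "YOUR_TOKEN",
    Source = Json.Document(
        Web.Contents(
            "https://api.example.com",
            [
                // Keep the base URL static and the changing part in RelativePath
                RelativePath = "v1/report-data",
                // The token travels inside the query, not in the data source credentials
                Headers = [Authorization = "Bearer " & Token]
            ]))
in
    Source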
Thanks in advance.
On-premises Gateway connection with DirectQuery
Hey!
I'm trying to connect a power bi dashboard with my SAP BW.
First, I'm trying to make a DirectQuery connection using an On-premises Data Gateway in personal mode.
For files where I used Import instead of DirectQuery, the gateway works fine while it is open. The problem is when I try to use DirectQuery.
Isn't it possible to make the connection through personal mode? In my case the enterprise gateway (or "corporate" gateway, as it used to be called) is not an option. Does that mean I won't be able to use DirectQuery with SAP BW?
Help with queries that pull data from two or more tables into Power BI via ODBC connection
Hi all
OK, so I am new to ODBC connections and was super excited to get it working and then pull data from a table. My ODBC connection uses a DSN for PostgreSQL. I am very new to SQL statements. I can pull data from one table by doing SELECT * or selecting a few fields from one table, but what I really need to do is pull data from 2 joined tables. Can anyone show me a little example of how to do this in Power BI (I presume this is done in the ODBC SQL advanced area)? Basically, if I have 2 tables, Table A and Table B, which share a relationship via connection_id, how do I do this? Also, filtering of results - is this done once the data is in Power BI via the filter options?
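A minimal sketch of such a join, assuming a DSN called "MyPostgresDSN" and hypothetical table and column names; the SQL is handed straight to the driver, so only the joined result lands in Power BI, and filtering can happen either in the WHERE clause or later with the normal filter options:
let
    // The DSN and the table/column names below are hypothetical
    Source = Odbc.Query(
        "dsn=MyPostgresDSN",
        "SELECT a.connection_id, a.site_name, b.supplier_name "
            & "FROM table_a a "
            & "JOIN table_b b ON a.connection_id = b.connection_id "
            & "WHERE a.active = true")
in
    Source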
Any help for this excited newbie much appreciated.
Facebook iOS app analytics integration
I'm trying to connect my Facebook mobile app analytics to Power BI. Can anyone please help me out? TIA!
JSON and Direct Query
Hi PBI Community,
I need some assistance. I am trying to parse JSON out of a column. The current connection type is DirectQuery to a SQL Server. Unfortunately, because of this connection type I cannot use the transform functions in the Query Editor to quickly parse out the JSON.
What options do I have while still remaining connected via DirectQuery?
Data looks like
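One workaround, sketched below with hypothetical names, is a second Import-mode copy of the table in which the JSON column can be parsed with Json.Document (the DirectQuery connection itself cannot apply those transforms); the other common route is to parse the JSON on the database side in a view, so DirectQuery only ever sees plain columns.
let
    // Import-mode copy of the table; server, database, table and column
    // names here are hypothetical
    Source = Sql.Database("MyServer", "MyDatabase"),
    Tickets = Source{[Schema = "dbo", Item = "Tickets"]}[Data],
    // Parse the JSON text column into records, then expand the keys of interest
    Parsed = Table.AddColumn(Tickets, "Payload",
        each Json.Document([JsonColumn])),
    Expanded = Table.ExpandRecordColumn(Parsed, "Payload",
        {"status", "priority"})
in
    Expanded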
Error loading R scripts in Power BI
I am having difficulty loading R scripts in Power BI:
"Unable to translate bytes [ED][A0] at index 67 from specified code page to Unicode."
Any solution, please? Thanks.
JSON API | Expression.Error: We cannot convert a value of type Record to type List.
Hi,
I'm trying to import some API data into Power BI and get the error:
Expression.Error: We cannot convert a value of type Record to type List.
Details:
Value=Record
Type=Type
The JSON extract from the API looks like the sample below. I can individually extract one record, but not the whole model. I'm new to JSON and I'm confused about how I'm supposed to convert the whole dataset into a table.
[
{
"configuration_items" : [],
"new_attachments" : [],
"new_logs" : [],
"summary" : "123",
"last_update_at" : "2017-11-06 21:56:08+00",
"case_type" : "Request for Information",
"ticket_number" : "TK-QA-00014999",
"created_at" : "2017-11-06 21:56:08+00",
"priority" : "PRI-333",
"category" : "CTI-QA-00000016",
"ticket_state" : "Open - Needs Attention",
"external_ref" : "INC"
},
[
{
"configuration_items" : [],
"new_attachments" : [],
"new_logs" : [],
"summary" : "123",
"last_update_at" : "2017-11-06 21:56:08+00",
"case_type" : "Request for Information",
"ticket_number" : "TK-QA-00000009",
"created_at" : "2017-11-06 21:56:08+00",
"priority" : "PRI-333",
"category" : "CTI-QA-00000016",
"ticket_state" : "Open - Needs Attention",
"external_ref" : "INC"
}
]
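A minimal sketch of one way to handle this shape, with a hypothetical endpoint: because the list mixes plain records with nested lists of records, flattening one level first avoids the Record-to-List error, after which the records can be expanded as usual:
let
    // Hypothetical endpoint returning the mixed list shown above
    Raw = Json.Document(Web.Contents("https://api.example.com/tickets")),
    // Flatten one level: keep records as-is, splice nested lists open
    Flattened = List.Combine(
        List.Transform(Raw, each if _ is list then _ else {_})),
    // One row per record, then expand the fields of interest
    AsTable = Table.FromList(Flattened, Splitter.SplitByNothing(), {"Record"}),
    Expanded = Table.ExpandRecordColumn(AsTable, "Record",
        {"ticket_number", "summary", "case_type", "created_at", "ticket_state"})
in
    Expanded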