Search Results

Search found 4908 results on 197 pages for 'ssas 2005'.


  • Recommended ways to install USB drivers with a Visual Studio 2005 Setup Project?

    - by tjmoore
    I need to install a USB driver with an application, and I'm using a Visual Studio 2005 Setup Project to create the installer. The driver only needs to be installed just enough that when the USB device is plugged in, Windows will go off doing its "installing device" routine and do the rest of the job. It would also be okay to have the setup finish and then have the user connect the device when required, with the driver install completing at that point. However, the user shouldn't be prompted to find a driver location. The USB drivers I have are available either as plain .sys / .inf files, or as a full installer (.msi together with a setup.exe wrapper). The full installer deals with combinations of operating systems and languages, but the application is for internal use and I can limit the target OS to Windows XP. Would it be better just to run the available installer via a custom action, or to install via the .inf file somehow (I'm not sure how to do that)?
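
    A hedged sketch of one common approach (not from the original thread): if you can redistribute Microsoft's Driver Install Frameworks tool, a custom action that runs DPInst against the .inf folder quietly pre-stages the driver so Windows finds it when the device is plugged in. The path below is illustrative:

        REM Pre-stage the driver package quietly (dpinst.exe ships with the WDK/DIFx)
        dpinst.exe /q /sw /path "C:\Program Files\MyApp\Drivers"

    Alternatively, rundll32 can install a plain .inf directly, e.g. rundll32.exe setupapi.dll,InstallHinfSection DefaultInstall 132 C:\Program Files\MyApp\Drivers\mydevice.inf, though this only works for .inf files that define a DefaultInstall section, which plug-and-play driver packages often do not.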

    Read the article

  • Is there a way to get an ASMX Web Service created in VS 2005 to receive and return JSON?

    - by Ben McCormack
    I'm using .NET 2.0 and Visual Studio 2005 to try to create a web service that can be consumed both as SOAP/XML and JSON. I read Dave Ward's answer to the question How to return JSON from a 2.0 asmx web service (in addition to reading other articles at Encosia.com), but I can't figure out how I need to set up the code of my asmx file in order to work with JSON using jQuery. Two questions: How do I enable JSON in my .NET 2.0 ASMX file? What's a simple jQuery call that could consume the service using JSON? Also, I notice that since I'm using .NET 2.0, I'm not able to use System.Web.Script.Services.ScriptService. Here's my C# code for the demo ASMX service:

        using System;
        using System.Web;
        using System.Collections;
        using System.Web.Services;
        using System.Web.Services.Protocols;

        /// <summary>
        /// Summary description for StockQuote
        /// </summary>
        [WebService(Namespace = "http://tempuri.org/")]
        [WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
        public class StockQuote : System.Web.Services.WebService
        {
            public StockQuote()
            {
                //Uncomment the following line if using designed components
                //InitializeComponent();
            }

            [WebMethod]
            public decimal GetStockQuote(string ticker)
            {
                //perform database lookup here
                return 8;
            }

            [WebMethod]
            public string HelloWorld()
            {
                return "Hello World";
            }
        }

    Here's a snippet of jQuery I found on the internet and tried to modify:

        $(document).ready(function () {
            $("#btnSubmit").click(function (event) {
                $.ajax({
                    type: "POST",
                    contentType: "application/json; charset=utf-8",
                    url: "http://bmccorm-xp/WebServices/HelloWorld.asmx",
                    data: "",
                    dataType: "json"
                });
                event.preventDefault();
            });
        });
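
    A hedged sketch of the usual fix (following the Encosia articles the question cites, not code from the question itself): plain .NET 2.0 ASMX cannot emit JSON, but installing the ASP.NET AJAX Extensions 1.0 add-on for .NET 2.0 makes System.Web.Script.Services.ScriptService available, after which decorating the service class with [ScriptService] enables JSON responses. The jQuery call must then POST a JSON-formatted body to the method URL; the names below are illustrative:

        $.ajax({
            type: "POST",
            url: "StockQuote.asmx/GetStockQuote",   // service page + method name
            data: "{'ticker':'MSFT'}",              // JSON-formatted body, never ""
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (response) {
                // Depending on the ASP.NET AJAX version, the payload arrives
                // either bare or wrapped in a ".d" property
                alert(response.d !== undefined ? response.d : response);
            }
        });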

    Read the article

  • SQL Server 2005 - Enabling both Named Pipes & TCP/IP protocols?

    - by Clinemi
    We have a SQL Server 2005 database, and currently all our users are connecting to the database via the TCP/IP protocol. The SQL Server Configuration Manager allows you to "enable" both Named Pipes and TCP/IP connections at the same time. Is this a good idea? My question is not whether we should use Named Pipes instead of TCP/IP, but whether there are problems associated with enabling both. One of our client's IT guys says that enabling database communication with both protocols will limit the bandwidth that either protocol can use, to roughly 50% of the total. I would think that the bandwidth TCP/IP could use would be inversely tied to the amount of traffic that Named Pipes (or any other type of traffic) was occupying on the network at that moment. However, this IT person is claiming that merely having two protocols enabled on the server artificially limits the bandwidth that TCP/IP can use. Is this correct? I did Google searches but could not come up with an answer to this question. Any help would be appreciated.
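
    As an aside, you can check which protocol each current connection actually negotiated with a quick query against a SQL Server 2005 dynamic management view, which is one way to test the claim empirically:

        -- Shows the transport (TCP, Named pipe, Shared memory) per connection
        SELECT session_id, net_transport, auth_scheme
        FROM sys.dm_exec_connections;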

    Read the article

  • How can I set up port forwarding for SQL Server 2005?

    - by Manish
    Hello. Subject: how to use port forwarding.

    Internet -> Router in my network -> Local machine (Windows 2003) -> SQL Server 2005

    How can I access SQL Server through the internet via a router in the local network? My router's IP address is 192.168.1.86. My local machine, which is connected to the router, has IP address 192.168.1.81, with SQL Server on port 1433. Please tell me how to set up port forwarding. Thanks in advance for any help.
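
    In outline, the usual answer is: on the router, forward external TCP port 1433 to 192.168.1.81 port 1433; in SQL Server Configuration Manager, make sure the TCP/IP protocol is enabled for the instance and listening on 1433; and open TCP 1433 in the Windows firewall on the local machine. You can then test from outside with sqlcmd against the router's public address using the explicit host,port form, for example sqlcmd -S <public-ip>,1433 -U <sql-login> -P <password> (with an explicit port, the SQL Browser service is not needed). Note that exposing port 1433 directly to the internet carries real risk, so a VPN is generally preferred.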

    Read the article

  • Which method of SQL Server 2005 or 2008 Replication is best for ease of field changes?

    - by Rick
    We need 15-minute warm updates from one SQL Server to another. Log shipping looks good and appears easy to set up. We are also looking into transactional replication. The data only needs to copy one way. We have two main requirements: 1) The destination database needs to be at most a 15-minute-old copy of the source. It needs to retry and get up to date if a network cable is unplugged for a while. 2) We would really like table changes in the source (fields added or modified) to carry over as easily as possible. Thanks in advance for all suggestions.
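
    On requirement 2, it may help to know that SQL Server 2005 transactional replication propagates most schema changes (such as ALTER TABLE ... ADD) to subscribers by default, controlled by the publication's replicate_ddl property, while log shipping restores whole transaction logs, so schema changes flow through with no extra work at all. A sketch, with a hypothetical publication name:

        -- Transactional replication: DDL propagation is on when @replicate_ddl = 1
        EXEC sp_addpublication
            @publication = N'WarmStandbyPub',
            @replicate_ddl = 1;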

    Read the article

  • SQL Server 2005 Disk Configuration: Single RAID 1+0 or multiple RAID 1+0s?

    - by mfredrickson
    Assuming that the workload for the SQL Server is just a normal OLTP database, and that there are a total of 20 disks available, which configuration would make more sense? A single RAID 1+0 containing all 20 disks: this physical volume would contain both the data files and the transaction log files, but two logical drives would be created from this RAID, one for the data files and one for the log files. Or two RAID 1+0s, each containing 10 disks: one physical volume would contain the data files, and the other would contain the log files. The reason for this question is a disagreement between me (SQL developer) and a co-worker (DBA). For every configuration that I've done, or seen others do, the data files and transaction log files were separated at the physical level and were placed on separate RAIDs. However, my co-worker's argument is that by placing all the disks into a single RAID 1+0, any IO that is done by the server is potentially shared between all 20 disks, instead of just 10 disks in my suggested configuration. Conceptually, his argument makes sense to me. Also, I've found some information from Microsoft that seems to support his position: http://technet.microsoft.com/en-us/library/cc966414.aspx In the section titled "3. RAID10 Configuration", showing a configuration in which all 20 disks are allocated to a single RAID 1+0, it states: "In this scenario, the I/O parallelism can be used to its fullest by all partitions. Therefore, distribution of I/O workload is among 20 physical spindles instead of four at the partition level." But every other configuration I've seen suggests physically separating the data and log files onto separate RAIDs. Everything I've found here on Server Fault suggests the same. I understand that log files will be write-heavy, and that data files will be a combination of reads and writes, but does this require that the files be placed onto separate RAIDs instead of a single RAID?

    Read the article

  • Welcome to the Red Gate BI Tools Team blog!

    - by BI Tools Team
    Welcome to the first ever post on the brand new Red Gate Business Intelligence Tools Team blog!

    About the team

    Nick Sutherland (product manager): After many years as a software developer and project manager, Nick took an MBA and turned to product marketing. SSAS Compare is his second lean startup product (the first being SQL Connect). Follow him on Twitter.

    David Pond (developer): Before he joined Red Gate in 2011, David made monitoring systems for Goodyear. Follow him on Twitter.

    Jonathan Watts (tester): Jonathan became a tester after finishing his media degree and joining Xerox. He joined Red Gate in 2004. Follow him on Twitter.

    James Duffy (technical author): After a spell as a writer in the video game industry, James lived briefly in Tokyo before returning to the UK to start at Red Gate.

    What we're working on

    We launched a beta of our first tool, SSAS Compare, last month. It works like SQL Compare but for SSAS cubes, letting you deploy just the changes you want. It's completely free (for now), so check it out. We're still working on it, and we're eager to hear what you think. We hope SSAS Compare will be the first of several tools Red Gate develops for BI professionals, so keep an eye out for more from us in the future.

    Why we need you

    This is your chance to help influence the course of SSAS Compare and our future BI tools. If you're a business intelligence specialist, we want to hear about the problems you face so we can build tools that solve them. What do you want to see? Tell us! We'll be posting more about SSAS Compare, business intelligence and our journey into BI in the coming days and weeks. Stay tuned!

    Read the article

  • Microsoft Visual Studio Release History/Timelines/Milestones

    1975 – Bill Gates and Paul Allen write a version of BASIC for the Altair 8080
    1982 – IBM releases BASCOM 1.0 (developed by Microsoft)
    1983 – Microsoft BASIC Compiler System v5.35 for MS-DOS released
    1984 – Microsoft BASIC Compiler System v5.36 released
    1985 – Microsoft QuickBASIC 1.0
    1986 – Microsoft QuickBASIC 1.01, 1.02, 2.00
    1987 – Microsoft QuickBASIC 2.01, 3.00, 4.00
    1987 – Microsoft BASIC 6.0
    1988 – Microsoft QuickBASIC 4.00, 4.00b, 4.50
    1989 – Microsoft BASIC Professional Development System 7.0
    1990 – Microsoft BASIC Professional Development System 7.1
    1991 – Microsoft Visual Basic released May 20 at the Windows World Convention, Atlanta
    1992 – Microsoft Visual Basic 2.0
    1993 – Microsoft Visual Basic 3.0 in Standard and Professional versions
    1995 – Microsoft Visual Basic 4.0 released; supported the new Windows 95
    1997 – Microsoft Visual Basic 5.0; introduction of IntelliSense
    1998 – Microsoft Visual Studio 6.0, which included Visual Basic 6.0, released (first Visual Studio)
    2002 – Microsoft Visual Basic .NET 7.0
    2002 – Visual Studio .NET
    2003 – Microsoft Visual Basic .NET 7.1
    2003 – Microsoft Visual Studio w/IntelliSense
    2003 – Visual Studio .NET 2003
    2004 – Visual Studio 2005 (code name Whidbey) announced
    2005 – Visual Studio 2005 released with extensibility
    2005 – Visual Studio Express released
    2006 – Expression tool set released; developers and designers work together
    2006 – Visual Studio Team release, November 30th
    2007 – Visual Studio 2008 (code name Orcas) ships in November, with the Visual Studio Shell
    2010 – Visual Studio 2010 (code name Rosario)

    Read the article

  • Simulate ROW_NUMBER in SQL 2000

    - by Derek Dieter
    While the ROW_NUMBER feature in SQL 2005+ has proven to be very powerful, there are still ways to implement the same functionality in SQL Server 2000. Let's first look at the SQL 2005+ implementation of ROW_NUMBER, then compare it to the SQL 2000:

        -- SQL 2005+
        SELECT RowNumber = ROW_NUMBER() OVER (ORDER BY c.LastName ASC)
             , c.LastName
             , c.FirstName
        FROM [...]
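
    The article's SQL 2000 version is truncated above, but the classic workaround is a correlated COUNT(*) that ranks each row by the ordering columns. A sketch against a hypothetical Customers table (the original's table name is elided):

        -- SQL 2000: rank by LastName, using CustomerID to break ties
        SELECT RowNumber = (SELECT COUNT(*)
                            FROM Customers c2
                            WHERE c2.LastName < c.LastName
                               OR (c2.LastName = c.LastName
                                   AND c2.CustomerID <= c.CustomerID))
             , c.LastName
             , c.FirstName
        FROM Customers c
        ORDER BY c.LastName, c.CustomerID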

    Read the article

  • Row Count Plus Transformation

    As the name suggests, we have taken the current Row Count Transform that is provided by Microsoft in the Integration Services toolbox, and we have recreated the functionality and extended upon it. There are two things about the current version that we thought could do with cleaning up: the lack of a custom UI, and the fact that you have to type the variable name yourself. In the Row Count Plus Transformation we solve these issues for you. Another thing we thought was missing is the ability to calculate the time taken between components in the pipeline. An example usage would be that you want to know how many rows flowed between Component A and Component B and how long it took. Again we have solved this issue. Credit must go to Erik Veerman of Solid Quality Learning for the idea behind noting the duration. We were looking at one of his packages and saw that he was doing something very similar, but he was using a Script Component as a transformation. Our philosophy is that if you have to write or copy and paste the same piece of code more than once, then you should be thinking about a custom component, and here it is.

    The Row Count Plus Transformation populates variables with the values returned from counting the rows that have flowed through the path, and from noting the time in seconds between when it first saw a row come down the path and when it saw the final row. It is possible to leave both these boxes blank and the component will still work. All input columns are passed through the transformation unaltered; you are not permitted to change or add to the inputs or outputs of this component. Optionally, you can set the component to fire an event, which happens during the PostExecute phase of the execution. This can be useful to improve visibility of this information, such that it is captured in package logging, or it can be used to drive workflow in the case of an error event.

    Properties

    OutputRowCountVariable (String): The name of the variable into which the row count will be passed (optional).
    OutputDurationVariable (String): The name of the variable into which the duration in seconds will be passed (optional).
    EventType (RowCountPlusTransform.EventType): The type of event to fire during post-execute, which includes the row count and duration values.

    RowCountPlusTransform.EventType enumeration

    None (0): Do not fire any event.
    Information (1): Fire an Information event.
    Warning (2): Fire a Warning event.
    Error (3): Fire an Error event.

    Installation

    The component is provided as an MSI file, which you can download and run to install it. This simply places the files on disk in the correct locations and also installs the assemblies in the Global Assembly Cache, as per Microsoft's recommendations. You may need to restart the SQL Server Integration Services service, as this caches information about what components are installed, as well as restarting any open instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages.

    For 2005/2008 only: you will have to add the transformation to the Visual Studio toolbox manually. Right-click the toolbox and select Choose Items.... Select the SSIS Data Flow Items tab, and then check the Row Count Plus Transformation in the Choose Toolbox Items window. This process has been described in detail in the related FAQ entry for How do I install a task or transform component?

    We recommend you follow best practice and apply the current Microsoft SQL Server service pack to your SQL Server servers and workstations; this component requires a minimum of SQL Server 2005 Service Pack 1.

    Downloads

    The Row Count Plus Transformation is available for SQL Server 2005, SQL Server 2008 (including R2) and SQL Server 2012. Please choose the version that matches your SQL Server version, or you can install multiple versions and use them side by side if you have more than one version of SQL Server installed.

    Row Count Plus Transformation for SQL Server 2005
    Row Count Plus Transformation for SQL Server 2008
    Row Count Plus Transformation for SQL Server 2012

    Version History

    SQL Server 2012: Version 3.0.0.6 – SQL Server 2012 release. Includes upgrade support for both 2005 and 2008 packages to 2012. (5 Jun 2012)
    SQL Server 2008: Version 2.0.0.5 – SQL Server 2008 release. (15 Oct 2008)
    SQL Server 2005: Version 1.1.0.43 – Bug fix for duration. For long-running processes the duration second count may have been incorrect. (8 Sep 2006)
    SQL Server 2005: Version 1.1.0.42 – SP1 compatibility testing. Added the ability to raise an event with the count and duration data for easier logging or workflow. (18 Jun 2006)
    SQL Server 2005: Version 1.0.0.1 – SQL Server 2005 RTM. Made available as a general public release. (20 Mar 2006)

    Troubleshooting

    Make sure you have downloaded the version that matches your version of SQL Server. We offer separate downloads for SQL Server 2005, SQL Server 2008 and SQL Server 2012. If, when you try to use the component, you get an error along the lines of "The component could not be added to the Data Flow task. Please verify that this component is properly installed. ... The data flow object "Konesans ..." is not installed correctly on this computer", this usually indicates that the internal cache of SSIS components needs to be updated. This cache is held by the SSIS service, so you need to restart the SQL Server Integration Services service. You can do this from the Services applet in Control Panel or Administrative Tools in Windows. You can also restart the computer if you prefer. You may also need to restart any current instances of Business Intelligence Development Studio (BIDS) / Visual Studio that you may be using to build your SSIS packages. Once installation is complete, you need to manually add the task to the toolbox before you will see it and be able to add it to packages - How do I install a task or transform component?
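
    For comparison, the Script Component approach credited to Erik Veerman above could be sketched roughly as follows in an SSIS 2008 C# Script Component configured as a transformation, with two ReadWriteVariables; the variable names are hypothetical and this is not the component's own source:

        // Counts rows and notes elapsed seconds between the first and last row,
        // then writes both to package variables in PostExecute (the only phase
        // where ReadWriteVariables may be written). Input0Buffer, UserComponent
        // and the Variables wrapper are scaffolding generated by SSIS.
        public class ScriptMain : UserComponent
        {
            private int rowCount;
            private DateTime firstRow;
            private DateTime lastRow;

            public override void Input0_ProcessInputRow(Input0Buffer Row)
            {
                if (rowCount == 0) firstRow = DateTime.Now;
                rowCount++;
                lastRow = DateTime.Now;
            }

            public override void PostExecute()
            {
                base.PostExecute();
                Variables.RowCountOut = rowCount;
                Variables.DurationSeconds = (int)(lastRow - firstRow).TotalSeconds;
            }
        }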

    Read the article

  • Rownum in SQL Server

    - by Derek Dieter
    Prior to SQL Server 2005, there was no built-in function to generate a row number within a row. There is a workaround, however, for SQL 2000. If you are on SQL 2005+, you can use the following function:

        -- SQL 2005+
        SELECT RowNumber = ROW_NUMBER() OVER (ORDER BY c.CustomerID ASC)
        [...]
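
    The SQL 2000 workaround is cut off above. Besides a correlated COUNT(*) subquery, a common trick is numbering rows into a temp table with the IDENTITY function. A sketch with a hypothetical Customers table; note that relying on ORDER BY during SELECT ... INTO to dictate identity order is not formally guaranteed:

        -- SQL 2000: materialize a row number with IDENTITY()
        SELECT IDENTITY(int, 1, 1) AS RowNumber, c.CustomerID, c.LastName
        INTO #NumberedCustomers
        FROM Customers c
        ORDER BY c.CustomerID

        SELECT RowNumber, CustomerID, LastName
        FROM #NumberedCustomers
        ORDER BY RowNumber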

    Read the article

  • Configuring SQL Server Management Studio to use Windows Integrated Authentication … from non-domain machines

    - by Enrique Lima
    Did you know you can pass your Windows credentials to SQL Server even when working from a workstation that is not joined to a domain? Here is how.

    From Start, click All Programs and find Microsoft SQL Server (version 2005 or 2008). Once there, right-click SQL Server Management Studio, then click Properties. Now the real task (we will be using the runas command): modify the shortcut's Target as follows, and remember to replace <domain\user> with the values that correspond to your environment.

    x64, SQL Server 2008:
        C:\Windows\System32\runas.exe /user:<domain\user> /netonly "C:\Program Files (x86)\Microsoft SQL Server\100\Tools\binn\VSShell\Common7\IDE\Ssms.exe -nosplash"

    x64, SQL Server 2005:
        C:\Windows\System32\runas.exe /user:<domain\user> /netonly "C:\Program Files (x86)\Microsoft SQL Server\90\Tools\binn\VSShell\Common7\IDE\SqlWb.exe -nosplash"

    x86, SQL Server 2008:
        C:\Windows\System32\runas.exe /user:<domain\user> /netonly "C:\Program Files\Microsoft SQL Server\100\Tools\binn\VSShell\Common7\IDE\Ssms.exe -nosplash"

    x86, SQL Server 2005:
        C:\Windows\System32\runas.exe /user:<domain\user> /netonly "C:\Program Files\Microsoft SQL Server\90\Tools\binn\VSShell\Common7\IDE\SqlWb.exe -nosplash"

    Since we modified the shortcut, we need to fix the icon for SSMS. Fix it by pressing the Change Icon... button and pointing to the original icon provider. It is the SSMS executable that holds the icon information, so point to:

    x64, SQL Server 2008:
        %ProgramFiles(x86)%\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\Ssms.exe

    x64, SQL Server 2005:
        C:\Program Files (x86)\Microsoft SQL Server\90\Tools\binn\VSShell\Common7\IDE\SqlWb.exe

    x86, SQL Server 2008:
        %ProgramFiles%\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\Ssms.exe

    x86, SQL Server 2005:
        C:\Program Files\Microsoft SQL Server\90\Tools\binn\VSShell\Common7\IDE\SqlWb.exe

    When you start SSMS from the modified shortcut, you'll be prompted for your domain password. SSMS will show a different account in the username box, but the configuration above does work, and your credentials will be passed on correctly.

    Read the article

  • Importing Analysis Services 2008 KPI's in a PerformancePoint scorecard

    - by Colin
    I am trying to import a KPI from Analysis Services into a PerformancePoint Scorecard, and when I do, The Dashboard Designer throws an error: An unknown error has occurred. If the problem persists contact an administrator. There may be additional information in the server application event log. When I examine the event log, I find the following exception: System.IO.FileNotFoundException: Could not load file or assembly 'Microsoft.AnalysisServices, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. File name: 'Microsoft.AnalysisServices, Version=9.0.242.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' at Microsoft.PerformancePoint.Scorecards.Server.ImportExportHelper.GetImportableAsKpis(IBpm pmService, DataSource asDataSource) at Microsoft.PerformancePoint.Scorecards.Server.PmServer.GetImportableAsKpis(DataSource dataSource) I have found this thread which recommends reinstalling Microsoft ADOMD.NET but the installer for that won't run because the server already has a newer version of the product (The server is running SQL Server Analysis Services 2008 which includes Microsoft.AnalysisServices.AdomdClient.dll version 9.0.3042.0) Anyone have any ideas (short of finding the DLL myself and manually installing it to the GAC)?
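
    Two hedged possibilities, neither from the original thread: the SQL Server 2005 Feature Pack's "Management Objects Collection" installs the version 9.0 Microsoft.AnalysisServices assemblies side by side with the 2008 ones, which is usually the cleanest way to satisfy the 9.0.242.0 reference. Alternatively, an assembly binding redirect in the consuming application's config file can point the old reference at a newer installed version, though whether PerformancePoint tolerates redirecting to the 2008 (10.0) assembly is untested here:

        <!-- Hypothetical binding redirect; goes in the relevant web.config/app.config -->
        <dependentAssembly>
          <assemblyIdentity name="Microsoft.AnalysisServices"
                            publicKeyToken="89845dcd8080cc91" culture="neutral" />
          <bindingRedirect oldVersion="9.0.242.0" newVersion="10.0.0.0" />
        </dependentAssembly>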

    Read the article

  • mdx datediff error

    - by Roberto Durand
    select measures.name
         , datediff("d", [Fecha].[Date].currentmember.member_value,
                         [Dim Date].[Date].currentmember.member_value)
    on 1
    from cube

    Error: Execution of the managed stored procedure datediff failed with the following error: Exception has been thrown by the target of an invocation. Argument 'Date1' cannot be converted to type 'Date'.

    Are there any requirements for using datediff in MDX? In the dimension these members are defined as datetime; I'm not sure if this influences the result in any way.
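
    A hedged guess at a fix (the members' values may be reaching the function as strings rather than dates, which would trigger exactly this conversion error): call the VBA assembly's DateDiff explicitly and coerce both arguments with CDate. The names below mirror the question but are otherwise illustrative:

        WITH MEMBER Measures.[Days Between] AS
            VBA!DateDiff("d",
                CDate([Fecha].[Date].CurrentMember.MemberValue),
                CDate([Dim Date].[Date].CurrentMember.MemberValue))
        SELECT Measures.[Days Between] ON 0
        FROM [Cube]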

    Read the article

  • In SQL Server Business Intelligence, why would I create a report model from an OLAP cube?

    - by ngm
    In Business Intelligence Developer Studio, I'm wondering why one would want to create a report model from an OLAP cube. As far as I understand it, OLAP cubes and report models are both business-oriented views of underlying structures (usually relational databases) that may not mean much to a business user. The cube is a multidimensional view in terms of dimensions and measures, and the report model is... well I'm not sure entirely -- is it a more business-oriented, but still essentially relational view? Anyway, in Report Builder I can connect directly to both an OLAP cube or a report model. So I don't see why, if I have an OLAP cube which already provides a business-oriented view of the data suitable for end-users, why I would then convert that to a report model and use that in Report Builder instead. I think I'm obviously missing some fundamental difference between report models and cubes -- any help appreciated!

    Read the article

  • MDX equivalent to SQL subqueries with aggregation

    - by James Lampe
    I'm new to MDX and trying to solve the following problem. I've investigated calculated members, subselects, scope statements, etc., but can't quite get it to do what I want. Let's say I'm trying to come up with the MDX equivalent to the following SQL query:

        SELECT SUM(netMarketValue) net,
               SUM(CASE WHEN netMarketValue > 0 THEN netMarketValue ELSE 0 END) assets,
               SUM(CASE WHEN netMarketValue < 0 THEN netMarketValue ELSE 0 END) liabilities,
               SUM(ABS(netMarketValue)) gross,
               someEntity1
        FROM (SELECT SUM(marketValue) netMarketValue, someEntity1, someEntity2
              FROM <some set of tables>
              GROUP BY someEntity1, someEntity2) t
        GROUP BY someEntity1

    In other words, I have an account ledger where I hide internal offsetting transactions (within someEntity2), then calculate assets and liabilities after aggregating them by someEntity2. Then I want to see the grand total of those assets and liabilities aggregated by the bigger entity, someEntity1. In my MDX schema I'd presumably have a cube with dimensions for someEntity1 and someEntity2, and marketValue would be my fact table/measure. I suppose I could create another DSV that did what my subquery does (calculating net), and simply create a cube with that as my measure dimension, but I wonder if there is a better way. I'd rather not have two cubes (one for these net calculations and another to go to a lower level of granularity for other use cases), since it would duplicate a lot of info in my database. These will be very large cubes.
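
    One hedged way to express this without a second cube: aggregate over the someEntity2 level with calculated members, so each entity2's value is already netted by the cube's normal aggregation before the sign test is applied. A sketch with illustrative dimension and measure names:

        WITH
        MEMBER Measures.Assets AS
            SUM([SomeEntity2].[SomeEntity2].Members,
                IIF(Measures.[Market Value] > 0, Measures.[Market Value], NULL))
        MEMBER Measures.Liabilities AS
            SUM([SomeEntity2].[SomeEntity2].Members,
                IIF(Measures.[Market Value] < 0, Measures.[Market Value], NULL))
        MEMBER Measures.Gross AS Measures.Assets - Measures.Liabilities
        SELECT { Measures.[Market Value], Measures.Assets,
                 Measures.Liabilities, Measures.Gross } ON 0,
               [SomeEntity1].[SomeEntity1].Members ON 1
        FROM [SomeCube]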

    Read the article

  • Subselecting with MDX

    - by Vince
    Greetings Stack Overflow community. I've recently started building an OLAP cube in SSAS 2008 and have gotten stuck. I would be grateful if someone could at least point me in the right direction. Situation: two fact tables, same cube. FactCalls holds information about calls made by subscribers; FactTopups holds topup data. Both tables have numerous common dimensions, one of them being the Subscriber dimension.

        FactCalls          FactTopups
        SubscriberKey      SubscriberKey
        CallDuration       DateKey
        CallCost           Topup Value
        ...

    What I am trying to achieve is to be able to build FactCalls reports based on distinct subscribers that have topped up their accounts within the last 7 days. What I am basically looking for is an MDX equivalent to SQL's:

        select *
        from FactCalls
        where SubscriberKey in
              (select distinct SubscriberKey from FactTopups where ...);

    I've tried creating a degenerate dimension for both tables containing SubscriberKey and doing:

        Exist( [Calls Degenerate].[Subscriber Key].Children,
               [Topups Degenerate].[Subscriber Key].Children )

    Without success. Kind regards, Vince
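
    A hedged sketch of one common pattern (illustrative names, and assuming a concrete date range standing in for "the last 7 days"): filter the real Subscriber dimension to members with topup activity using NonEmpty against the topup measure, then slice the calls query by that set, rather than comparing degenerate dimensions:

        WITH SET [Recent Topup Subscribers] AS
            NonEmpty([Subscriber].[Subscriber Key].[Subscriber Key].Members,
                     ([Measures].[Topup Value],
                      [Date].[Date].&[20100401] : [Date].[Date].&[20100407]))
        SELECT { [Measures].[Call Duration], [Measures].[Call Cost] } ON 0
        FROM [Cube]
        WHERE ([Recent Topup Subscribers])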

    Read the article

  • Using MySQL as data source in Microsoft SQL Server Analysis Services

    - by coldilocks
    Hi, I have installed the latest .net connector (http://www.mysql.com/downloads/connector/net/), I can add MySQL databases as Data Sources, I can even browse through the data from Business Intelligence Studio. The problem is that I CANNOT create a datasource view, or if I do create one without tables, trying to add them after the fact gives me the same error. Specifically it looks like the data source view wizard tries to submit queries against the MySQL database using square brackets/braces, and the query bombs. I get an error message like: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '[my_db].[cheatType]' at line 2 So, in summary, has anyone been able to create a data source view using MySQL tables and, if so, can they please show me how this can be done. Thanks for any help!
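
    A hedged workaround that is often suggested for this: create the data source view empty, then add each table as a named query whose SQL you write yourself, so the designer never generates the bracketed identifiers that MySQL rejects. For example, a named query of roughly SELECT * FROM my_db.cheatType (table name taken from the error above) stands in for adding the table directly.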

    Read the article

  • Deploy only cube schema, without processing

    - by Ken Yao
    Is there a way to deploy only the cube schema, without processing the cube? It seems that in Visual Studio, when you deploy a cube, the default is "Deploy and Process". The problem is that processing takes so much time, and my main purpose is just writing some MDX script and seeing whether it works well against the cube structure. Processing the whole cube seems like overkill, so I ask.
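
    For what it's worth, BIDS/Visual Studio has a setting for exactly this: in the SSAS project's properties, under Configuration Properties > Deployment, changing Processing Option from Default to Do Not Process should make Deploy push only the schema and leave the cube unprocessed (this describes the standard project property, not anything stated in the question itself).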

    Read the article
