Search Results

Search found 14308 results on 573 pages for 'ssas 2008'.


  • check compiler version in visual studio 2008

    - by snorlaks
    Hello, I'm writing an application in C++, and after trying to run the built application (in debug mode) on another machine I got an error: "The application has failed to start because its side-by-side configuration is incorrect." I realised that DLLs from windows\WinSxS\ are missing, but I don't really know which folder contains what I actually need, and secondly I don't know how to check my compiler version in Visual Studio. Thanks for help

    Read the article

  • Visual studio 2008 unit test keeps failing

    - by Gerbrand
    I've created a method that calculates the harmonic mean of a list of doubles, but when I run the test it keeps failing even though the output results are the same.

    My harmonic mean method:

        public static double GetHarmonicMean(List<double> parameters)
        {
            var cumReciprocal = 0.0d;
            var countN = parameters.Count;
            foreach (var param in parameters)
            {
                cumReciprocal += 1.0d / param;
            }
            return 1.0d / (cumReciprocal / countN);
        }

    My test method:

        [TestMethod()]
        public void GetHarmonicMeanTest()
        {
            var parameters = new List<double> { 1.5d, 2.3d, 2.9d, 1.9d, 5.6d };
            const double expected = 2.32432293165495;
            var actual = OwnFunctions.GetHarmonicMean(parameters);
            Assert.AreEqual(expected, actual);
        }

    After running the test, the following message is shown:

        Assert.AreEqual failed. Expected:<2.32432293165495>. Actual:<2.32432293165495>.

    To me those are both the same value. Can somebody explain this? Or am I doing something wrong?

    Read the article

  • Import excel file into sql express 2008

    - by ck
    Hi, I've got some Excel files that were exported from tables in Access, and I want to import them into SQL Express 2005. I need a script that will convert nvarchar(255) columns to varchar(255) and preserve links when importing the data into SQL Express. Thanks
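    One possible approach, sketched below under the assumption that the ACE OLE DB provider is installed and ad hoc distributed queries are allowed: OPENROWSET can read the worksheet directly, and a CAST in the SELECT list handles the nvarchar-to-varchar conversion. The file path, sheet name, and column names here are placeholders, not values from the question.

        -- Enable ad hoc distributed queries (requires appropriate permissions):
        EXEC sp_configure 'show advanced options', 1; RECONFIGURE;
        EXEC sp_configure 'Ad Hoc Distributed Queries', 1; RECONFIGURE;

        -- Import a worksheet, converting nvarchar to varchar on the way in:
        SELECT CAST(SomeColumn AS varchar(255)) AS SomeColumn
        INTO dbo.ImportedTable
        FROM OPENROWSET('Microsoft.ACE.OLEDB.12.0',
                        'Excel 12.0;Database=C:\Data\Export.xls;HDR=YES',
                        'SELECT * FROM [Sheet1$]');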

    Read the article

  • VB 2008 - Index was outside the bounds of the array

    - by Jan
    Hey guys, I'm having a problem while reading the config.cfg file of my program. I can read the 23rd char of the file, but I can't read the 24th char (the last char in the file). This is the code:

        Dim CFGReader2 As System.IO.StreamReader
        CFGReader2 = My.Computer.FileSystem.OpenTextFileReader(CurDir() & "\Config.cfg")
        Dim Server(2) As String
        Server(0) = CFGReader2.ReadToEnd.Chars(23) 'This part works
        If Server(0) = 0 Then
            Server(1) = CFGReader2.ReadToEnd.Chars(24) 'Throws "Index was outside the bounds of the array"
        ElseIf Server(0) = 1 Then
            Server(2) = CFGReader2.ReadToEnd.Chars(24) 'Throws "Index was outside the bounds of the array"
            Server(1) = 10 + Server(2)
        ElseIf Server(0) = 2 Then
            Server(2) = CFGReader2.ReadToEnd.Chars(24) 'Throws "Index was outside the bounds of the array"
            Server(1) = 20 + Server(2)
        ElseIf Server(0) = 3 Then
            Server(2) = CFGReader2.ReadToEnd.Chars(24) 'Throws "Index was outside the bounds of the array"
            Server(1) = 30 + Server(2)
        End If

    And this is the file:

        Language = 2
        Server = 11

    Thanks for the answer! Frosty

    Read the article

  • Change path to save mysettings - VB.NET 2008

    - by yae
    Hi: I am using My.Settings to save user settings. The config file is saved under a path like C:\Documents and Settings\<user>\Local Settings\Application Data\<company>\<app>\<version>. Is it possible to change this path? For example, in my case I save app data in the "ProgramData" folder (Vista & W7) and I would like to save this config file in the same folder. Is that possible? Thanks in advance

    Read the article

  • VB.NET 2008 Application crashing during Do Loop

    - by RedHaze
    I am writing an application to compare each item in ListBox1 to all items in ListBox2. If an item is found, it should be removed from both lists; the goal is that only the items that were not found remain in both lists. The problem is that the application just hangs and I never get any results. I have looked at my code several times and I cannot figure out what's going on (programming noob, I know...). Can anybody help me with this? Code snippet:

        Private Sub Button2_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button2.Click
            Dim a As String
            Dim b As String
            Dim y As String
            For i As Integer = 0 To ListBox1.Items.Count - 1
                a = ListBox1.Items(i)
                y = 1
                Do While y = 1
                    For x As Integer = 0 To ListBox2.Items.Count - 1
                        b = ListBox2.Items(x)
                        Dim res As Int16 = String.Compare(a, b)
                        If res = 0 Then
                            y = 0
                            ListBox2.Items.Remove(i)
                            ListBox2.Items.Remove(x)
                        ElseIf x = ListBox1.Items.Count Then
                            Exit Do
                        End If
                    Next
                Loop
            Next
        End Sub

    Read the article

  • SQL 2008 CASE statement aggravation...

    - by Brad
    Why does this fail:

        DECLARE @DATE VARCHAR(50) = 'dasf'
        SELECT CASE WHEN ISDATE(@DATE) = 1 THEN CONVERT(date, @DATE) ELSE @DATE END

        Msg 241, Level 16, State 1, Line 2
        Conversion failed when converting date and/or time from character string.

    Why is it trying to convert 'dasf' to date when it clearly causes ISDATE(@DATE) = 1 to evaluate to false? If I do:

        SELECT ISDATE(@DATE)

    the return value is 0.
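    A note on why this happens: a CASE expression returns a single data type, chosen by data type precedence across all of its branches, and date outranks varchar, so SQL Server tries to convert the ELSE branch's string to date as well. A minimal sketch of one workaround, casting the date branch back to a string so both branches agree on varchar:

        DECLARE @DATE varchar(50) = 'dasf';

        SELECT CASE
                   WHEN ISDATE(@DATE) = 1
                       THEN CONVERT(varchar(50), CONVERT(date, @DATE), 120)  -- date branch rendered as text
                   ELSE @DATE                                                -- string branch unchanged
               END;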

    Read the article

  • best practices to delete a set of tables in sql 2008

    - by Hari
    Basically I want to keep the transaction very simple, but I should be able to roll back if any error occurs in the later part. Something like mentioned below:

        BEGIN TRANSACTION
        DELETE SET 1 (this will delete the first set of tables)
        COMMIT
        DELETE SET 2 (will delete the second set of tables)

    If any error occurs while deleting set 2, I should be able to roll back the set 1 transaction as well. Let me know if we have any options to do this. Appreciate your help.
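    One way to get that behaviour, sketched under the assumption that both delete sets can live in a single transaction (the table names below are placeholders): wrap everything in one transaction with TRY/CATCH, so a failure in the second set rolls back the first set too.

        BEGIN TRY
            BEGIN TRANSACTION;

            DELETE FROM dbo.FirstSetTable1;   -- first set of tables
            DELETE FROM dbo.FirstSetTable2;

            DELETE FROM dbo.SecondSetTable1;  -- second set of tables
            DELETE FROM dbo.SecondSetTable2;

            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            IF @@TRANCOUNT > 0
                ROLLBACK TRANSACTION;         -- undoes both delete sets

            DECLARE @msg nvarchar(2048) = ERROR_MESSAGE();
            RAISERROR(@msg, 16, 1);           -- re-raise for the caller
        END CATCH;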

    Read the article

  • How to realize this in Transact SQL (SQL Server 2008)

    - by David
    I just want an update trigger like this PostgreSQL version... It seems to me there are no NEW and OLD records in MSSQL?

        CREATE OR REPLACE FUNCTION "public"."ts_create" () RETURNS trigger AS
        DECLARE
        BEGIN
            NEW.ctime := now();
            RETURN NEW;
        END;

    Googled already, but nothing to be found, unfortunately... Ideas?
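    For reference, T-SQL triggers expose the inserted and deleted pseudo-tables instead of NEW and OLD, and they fire once per statement rather than once per row. A minimal sketch of the equivalent (the table name and key column are assumptions, not from the question):

        -- Stamp ctime on newly inserted rows; "inserted" plays the role of NEW.
        CREATE TRIGGER trg_SetCtime
        ON dbo.SomeTable
        AFTER INSERT
        AS
        BEGIN
            SET NOCOUNT ON;
            UPDATE t
            SET t.ctime = GETDATE()
            FROM dbo.SomeTable AS t
            INNER JOIN inserted AS i
                ON i.ID = t.ID;  -- assumes an ID key column
        END;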

    Read the article

  • SQL Server 2008 - Update a temporary table

    - by user336786
    Hello, I have a stored procedure in which I am trying to retrieve the last ticket completed by each user listed in a comma-delimited string of usernames. A user may not have a ticket associated with them; in this case I know that I just need to return null. The tables I am working with are defined as follows:

        User:         UserName, FirstName, LastName
        Ticket:       ID, CompletionDateTime, AssignedTo, AssignmentDate, StatusID
        TicketStatus: ID, Comments

    Each record needs to include the comments associated with it. Currently, I'm trying the following:

        CREATE TABLE #Tickets
        (
            [UserName] nvarchar(256),
            [FirstName] nvarchar(256),
            [LastName] nvarchar(256),
            [TicketID] int,
            [DateCompleted] datetime,
            [Comments] text
        )

        -- This variable is actually passed into the procedure
        DECLARE @userList NVARCHAR(max)
        SET @userList = 'user1,user2,user2'

        -- Obtain the user information for each user
        INSERT INTO #Tickets ([UserName], [FirstName], [LastName])
        SELECT u.[UserName], u.[FirstName], u.[LastName]
        FROM User u
        INNER JOIN dbo.ConvertCsvToTable(@userList) l ON u.UserName = l.item

    At this point, I have the username, first name, and last name for each user passed in. However, I do not know how to actually get the last ticket completed by each of these users. How do I do this? I believe I should be updating the temp table I have created, but I do not know how to get just the last record in an update statement. Thank you!
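    One way to finish this, sketched with the column meanings assumed from the question (CompletionDateTime marks when a ticket was completed, AssignedTo holds the username): update the temp table from an OUTER APPLY that picks each user's most recently completed ticket, which naturally leaves NULLs for users with no tickets.

        UPDATE t
        SET t.TicketID      = lt.ID,
            t.DateCompleted = lt.CompletionDateTime,
            t.Comments      = lt.Comments
        FROM #Tickets AS t
        OUTER APPLY
        (
            SELECT TOP (1) tk.ID, tk.CompletionDateTime, ts.Comments
            FROM Ticket AS tk
            INNER JOIN TicketStatus AS ts ON ts.ID = tk.StatusID
            WHERE tk.AssignedTo = t.UserName
              AND tk.CompletionDateTime IS NOT NULL
            ORDER BY tk.CompletionDateTime DESC
        ) AS lt;   -- users with no completed tickets keep their NULLs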

    Read the article

  • What scenarios/settings will result in a query on SQL Server (2008) return stale data

    - by s1mm0t
    Most applications rarely need to display 100% accurate data. For example, if this Stack Overflow question displays that there have been 0 views when there have really been 10, it doesn't really matter. This is one way that the (perceived) performance of applications can be improved: by caching results and therefore sometimes not showing 100% accurate results. There are some cases where the data does need to be 100% accurate, though. So if I run the query

        SELECT * FROM Foo

    I want to be sure that the results are not stale. Now, depending on how my database is set up, other activity on the database, the use of transactions, isolation levels, etc., this query may or may not be a true reflection of the world. What scenarios and settings can people think of that will result in this query returning stale results? And given that another connection is partway through a transaction that has updated this table, how can I guarantee that when the above query returns, the results will be accurate?
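    As a point of reference, the transaction isolation level is the main knob here. A small sketch contrasting two settings (Foo as in the question):

        -- Under READ UNCOMMITTED (equivalent to a NOLOCK hint) the SELECT can
        -- return rows another transaction has changed but not yet committed:
        SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
        SELECT * FROM Foo;  -- may see "dirty" data that is later rolled back

        -- Under the default READ COMMITTED the SELECT blocks on rows locked by
        -- an in-flight transaction (or, with READ_COMMITTED_SNAPSHOT enabled,
        -- reads the last committed version of those rows instead):
        SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
        SELECT * FROM Foo;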

    Read the article

  • when creating a release version I get the following warnings (vs 2008 settings)

    - by djones2010
    warning LNK4075: ignoring '/EDITANDCONTINUE' due to '/OPT:ICF' specification
    error LNK2005: __initp_misc_invarg already defined in libcmtd.lib(invarg.obj)

    I have many more LNK2005 errors, all in the libcmt.lib file in invarg.obj, and also LNK4098: defaultlib conflicts with use of other libs. When I had it as debug it was all working; I just started to make a release build and everything went south. Could I get some help on how to do the release version? The lib I was using is a composite lib which was working with my test app. However, before I do the final release I wanted to test the release version of my lib, but when I include that into my test app I get the aforementioned errors.

    Read the article

  • TSQL - MSSQL 2008 add a column and update it in same stored procedure

    - by TortTupper
    If I have a stored procedure, say:

        CREATE PROCEDURE w
        AS
        ALTER TABLE t ADD x char(1)
        UPDATE t SET x = 1

    even when it lets me create the stored procedure (it does, if column x exists at creation time), there is an error on the UPDATE statement when it runs, because column x doesn't exist yet. What's the conventional way to deal with this? It must come up all the time. I can work around it by putting the UPDATE inside EXEC; is there another/better way? Thanks
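    For reference, the EXEC workaround the question mentions is the usual pattern: statements inside dynamic SQL are compiled only when they execute, after the ALTER has taken effect. A minimal sketch (the COL_LENGTH guard is an addition here, so reruns don't fail on a duplicate column):

        CREATE PROCEDURE w
        AS
        BEGIN
            IF COL_LENGTH('dbo.t', 'x') IS NULL
                ALTER TABLE dbo.t ADD x char(1);

            -- Compiled at execution time, after the column exists:
            EXEC sp_executesql N'UPDATE dbo.t SET x = ''1'';';
        END;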

    Read the article

  • Meet SQLBI at PASS Summit 2012 #sqlpass

    - by Marco Russo (SQLBI)
    Next week Alberto Ferrari and I will be in Seattle at PASS Summit 2012. You can meet us at our sessions, at a book signing, and hopefully while watching some other sessions during the conference. Here are our appointments:

    Thursday, November 08, 2012, 10:15 AM - 11:45 AM – Alberto Ferrari – Room 606-607
    Querying and Optimizing DAX (BIA-321-S)
    Do you want to learn how to write DAX queries and how to optimize them? Don't miss this session!

    Thursday, November 08, 2012, 12:00 PM - 12:30 PM – Bookstore
    Book signing event at the Bookstore corner with Alberto Ferrari, Marco Russo and Chris Webb
    Visit the bookstore and sign your copy of our Microsoft SQL Server 2012 Analysis Services: The BISM Tabular Model book.

    Thursday, November 08, 2012, 1:30 PM - 2:45 PM – Marco Russo – Room 611
    Near Real-Time Analytics with xVelocity (without DirectQuery) (BIA-312)
    What's the latency you can tolerate for your data? Discover the limit in Tabular without using DirectQuery, and learn how to optimize your data model and your queries for a near real-time analytical system. Not a trivial task, but more affordable than you might think.

    Friday, November 09, 2012, 9:45 AM - 11:00 AM
    Parent-Child Hierarchies in Tabular (BIA-301)
    Multidimensional has more advanced support for hierarchies than Tabular, but in reality you can do almost the same things by using data modeling, DAX functions, and BIDS Helper!

    Friday, November 09, 2012, 1:00 PM - 2:15 PM – Marco Russo – Room 612
    Inside DAX Query Plans (BIA-403)
    Discover the query plan for your DAX query, and learn how to read it and how to optimize a DAX query using this information.

    If you meet us at the conference, stop us and say hello: it's always nice to know our readers!

    Read the article

  • DIVIDE vs division operator in #dax

    - by Marco Russo (SQLBI)
    Alberto Ferrari wrote an interesting article about DIVIDE performance in DAX. This new function was introduced in SQL Server Analysis Services 2012 SP1, so it is also available in Excel 2013 (which still doesn't have other features/fixes introduced by the following Cumulative Updates…). The idea is that instead of writing:

        IF ( Sales[Quantity] <> 0, Sales[Amount] / Sales[Quantity], BLANK () )

    you can write:

        DIVIDE ( Sales[Amount], Sales[Quantity] )

    There is a third optional argument in DIVIDE that defines the result in case the denominator (second argument) is zero; by default its value is BLANK, so I omitted the third argument in my example. Using DIVIDE is very important, especially when you use a measure in MDX (for example in an Excel PivotTable), because it raises the chance that the non-empty evaluation for the result is evaluated in bulk mode instead of cell by cell. However, from a DAX point of view, you might find it's better to use the standard division operator, removing the IF statement. I suggest you read Alberto's article, because you will find that an expression applying a filter using FILTER is faster than using CALCULATE, which is against any rule of thumb you might have read until now! Again, this is not always true and depends on many conditions. Trying to simplify, we might say that for a simple calculation the query plan generated by FILTER could be more efficient; but, as usual, it depends, and 90% of the time using FILTER instead of CALCULATE produces slower performance. Do not take anything for granted, and always check the query plan when performance is your first concern!

    Read the article

  • Multidimensional Thinking–24 Hours of Pass: Celebrating Women in Technology

    - by smisner
    It's Day 1 of #24HOP, and it's been great to participate in this event with so many women from all over the world in one long training-fest. The SQL community has been abuzz on Twitter with running commentary, which is fun to watch while listening to the current speaker. If you missed the fun today because you're busy with all that work you've got to do, don't despair: all sessions are recorded and will be available soon. Keep an eye on the 24 Hours of Pass page for details. And the fun's not over today. Rather than run 24 hours consecutively, #24HOP is now broken down into 12 hours over two days, so check out the schedule to see if there's a session that interests you and fits your schedule. I'm pleased to announce that my business colleague Erika Bakse (Blog | Twitter) will be presenting on Day 2, her debut presentation for a PASS event. (And I'm also pleased to say she's my daughter!)

    Multidimensional Thinking: The Presentation

    My contribution to this lineup of terrific speakers was Multidimensional Thinking. Here's the abstract: "Whether you're developing Analysis Services cubes or creating PowerPivot workbooks, you need to get into a multidimensional frame of mind to produce a model that best enables users to answer their business questions on their own. Many database professionals struggle initially with multidimensional models because the data modeling process is much different than the one they use to produce traditional, third normal form databases. In this session, I'll introduce you to the terminology of multidimensional modeling and step through the process of translating business requirements into a viable model." If you watched the presentation and want a copy of the slides, you can download a copy here. And you're welcome to download the slides even if you didn't watch the presentation, but they'll make more sense if you did!

    Kimball All the Way

    There's only so much I can cover in the time allotted, but I hope that I succeeded in my attempt to build a foundation that prepares you for starting out in business intelligence. One of my favorite resources, which gets into much more detail about all kinds of scenarios (well beyond the basics!), is The Data Warehouse Toolkit (Second Edition) by Ralph Kimball. Anything from Kimball or the Kimball Group is worth reading. Kimball material might take reading and re-reading a few times before it makes sense. From my own experience, I found that I actually had to just build my first data warehouse using dimensional modeling on faith that I was going in the right direction, because it just didn't click with me initially. I've had years of practice since then, and I can say it does get easier with practice. The most important thing, in my opinion, is that you simply must prototype a lot and solicit user feedback, because ultimately the model needs to make sense to the users. They will definitely make sure you get it right!

    Schema Generation

    One question came up after the presentation about whether we use SQL Server Management Studio or Business Intelligence Development Studio (BIDS) to build the tables for the dimensional model. My answer? It really doesn't matter how you create the tables; use whatever method you're comfortable with. But it just so happens that it IS possible to set up your design in BIDS as part of an Analysis Services project and to have BIDS generate the relational schema for you. I did a webcast last year called Building a Data Mart with Integration Services that demonstrated how to do this. Yes, the subject was Integration Services, but as part of that presentation I showed how to leverage Analysis Services to build the tables, and then I showed how to use Integration Services to load those tables. I blogged about this presentation in September 2010 and included downloads of the project that I used. In the blog post, I explained that I missed a step in the demonstration. Oops. Just as an FYI, there were two more webcasts to finish the story begun with the data mart – Accelerating Answers with Analysis Services and Delivering Information with Reporting Services. If you want to just cut to the chase and learn how to use Analysis Services to build the tables, see the Using the Schema Generation Wizard topic in Books Online.

    Read the article
