Search Results

Search found 75233 results on 3010 pages for 'data distribution service'.


  • How to add a web service reference in a DLL

    - by dan
    I'm creating a DLL with a reference to a web service (I don't have a choice about that), but I also have to add the web service reference to the project that uses the DLL for it to work. For example, I have a DLL called API.DLL that calls a web service called WebService.svc, and I want to use it in a project called WinForm. First, I add a "Service Reference" to WebService.svc in API.DLL. Then I add a reference to API.DLL in WinForm, but it doesn't work unless I also add a service reference to WebService.svc in WinForm. What can I do to avoid that last step?
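    One common workaround, sketched below under stated assumptions (the IWebService contract name and the service URL are placeholders for whatever "Add Service Reference" actually generated inside API.DLL), is to build the binding and endpoint address in code inside the DLL so the consuming project needs no service reference or WCF configuration of its own:

```csharp
using System.ServiceModel;

// Hedged sketch; IWebService and the URL are placeholders for the contract that
// "Add Service Reference" generated inside API.DLL.
[ServiceContract]
public interface IWebService
{
    [OperationContract]
    string Echo(string text);
}

public static class ApiClientFactory
{
    public static IWebService Create()
    {
        var binding = new BasicHttpBinding();
        var address = new EndpointAddress("http://server/WebService.svc");

        // Building the binding and address in code inside API.DLL means the consuming
        // WinForm project needs no <system.serviceModel> configuration of its own.
        var factory = new ChannelFactory<IWebService>(binding, address);
        return factory.CreateChannel();
    }
}
```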

    Read the article

  • WCF service code in a Windows application

    - by Mariya
    Hello, I am working on a C# .NET application. I need to call a service from a Windows (WinForms) application, and I am using the code below to open the service host: using (ServiceHost host = new ServiceHost( typeof(class1), new Uri[] { new Uri("net.pipe://localhost") }) ) { } Then we have a client console application that connects to the ServiceHost. The problem is: when I create the service and client applications as console applications, both work fine. But if I host the service code in a Windows application and connect from the console client, I get a binding error ("No endpoint/address found to test"). Can anyone help me run the service from a C# Windows application? Thanks
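    Two things commonly bite here: the using block disposes the host as soon as it is constructed, and no endpoint has ever been added to it. A minimal sketch (IService1, Service1, and the "Service1" pipe address are placeholder names, not from the post) of hosting the service inside the Windows application:

```csharp
using System;
using System.ServiceModel;

[ServiceContract]
public interface IService1
{
    [OperationContract]
    string Ping();
}

public class Service1 : IService1
{
    public string Ping() { return "pong"; }
}

public static class HostHolder
{
    private static ServiceHost _host;

    // Call Start() from the form's Load event and Stop() when the form closes, so the
    // host stays open for the lifetime of the Windows application.
    public static void Start()
    {
        _host = new ServiceHost(typeof(Service1), new Uri("net.pipe://localhost"));

        // Without an explicit (or configured) endpoint the client has no address to
        // connect to, which typically surfaces as a "no endpoint/address found" error.
        _host.AddServiceEndpoint(typeof(IService1), new NetNamedPipeBinding(), "Service1");
        _host.Open();   // note: not wrapped in a using block that disposes it immediately
    }

    public static void Stop()
    {
        if (_host != null)
        {
            _host.Close();
        }
    }
}
```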

    Read the article

  • Call AsyncTask methods from another class/service (callbacks?)

    - by TiGer
    Hi, I was wondering if it's possible to call specific methods defined within an AsyncTask class from another class and/or service. In my specific case I have a Service playing some sounds, but the sound is selected from a list of available sounds. When a sound is selected it is downloaded from my home server, which takes some time (not much, let's say around 3-4 seconds; the sounds/effects aren't big). So my problem at the moment is that I have a service to play those sounds, and when I select one I want to show a ProgressDialog. The way to do that (if I understood correctly) is to use an AsyncTask, but the only thing the AsyncTask will do is tell my Service to play a specific sound from my server, so there is no "callback" from the service to the AsyncTask. How can I achieve that? How can I call a running AsyncTask, which sits in another class, and tell it all the work is done so it can stop showing the ProgressDialog? Or am I over-engineering this and there are other ways? Thanks in advance...
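    One alternative, sketched below with hypothetical names (SoundPickerActivity, SoundService, and playSound are not from the post), is to skip reaching into the running AsyncTask altogether: hand the service a callback, let the service invoke it when the sound is ready, and dismiss the ProgressDialog there.

```java
import android.app.Activity;
import android.app.ProgressDialog;

// A hedged sketch; class and method names are illustrative, not the poster's code.
public class SoundPickerActivity extends Activity {

    /** Callback the service invokes once the sound has been downloaded and started. */
    public interface SoundReadyListener {
        void onSoundReady();
    }

    /** Hypothetical service API: downloads the sound, plays it, then fires the callback. */
    public interface SoundService {
        void playSound(int soundId, SoundReadyListener listener);
    }

    private ProgressDialog progress;

    void playSelectedSound(SoundService service, int soundId) {
        progress = ProgressDialog.show(this, "", "Downloading sound...");
        service.playSound(soundId, new SoundReadyListener() {
            @Override
            public void onSoundReady() {
                // The service may call back from a background thread.
                runOnUiThread(new Runnable() {
                    @Override
                    public void run() {
                        progress.dismiss();
                    }
                });
            }
        });
    }
}
```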

    Read the article

  • Obtaining XML from U.S. Postal Service (USPS) rate calculator API with PHP

    - by Chris F
    Hoping somebody here can help me. I'm attempting to pull an XML page from the U.S. Postal Service (USPS) rate calculator, using PHP. Here is the code I am using (with my API login and password replaced, of course): <? $api = "http://production.shippingapis.com/ShippingAPI.dll?API=RateV4&XML=<RateV4Request ". "USERID=\"MYUSERID\" PASSWORD=\"MYPASSWORD\"><Revision/><Package ID=\"1ST\">". "<Service>FIRST CLASS</Service><FirstClassMailType>PARCEL</FirstClassMailType>". "<ZipOrigination>12345</ZipOrigination><ZipDestination>54321</ZipDestination>". "<Pounds>0</Pounds><Ounces>9</Ounces><Container/><Size>REGULAR</Size></Package></RateV4Request>"; $xml_string = file_get_contents($api); $xml = simplexml_load_string($xml_string); ?> Pretty straightforward. However, it never returns anything. I can paste the URL directly into my browser's address bar: http://production.shippingapis.com/ShippingAPI.dll?API=RateV4&XML=<RateV4RequestUSERID="MYUSERID" PASSWORD="MYPASSWORD"><Revision/><Package ID="1ST"><Service>FIRST CLASS</Service><FirstClassMailType>PARCEL</FirstClassMailType><ZipOrigination>12345</ZipOrigination><ZipDestination>54321</ZipDestination><Pounds>0</Pounds><Ounces>9</Ounces><Container/><Size>REGULAR</Size></Package></RateV4Request> And it returns the XML I need, so I know the URL is valid, but I cannot seem to capture it using PHP. Any help would be tremendously appreciated. Thanks in advance.
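    For what it's worth, the usual culprit with requests like this is that the raw query string contains spaces and quotes that the browser encodes for you but file_get_contents() does not. A hedged sketch of the same request with the XML parameter percent-encoded via http_build_query() (credentials and values are the poster's own placeholders):

```php
<?php
// Hedged sketch: percent-encode the XML query parameter so the spaces and quotes
// inside it do not break the HTTP request.
$xmlRequest =
    '<RateV4Request USERID="MYUSERID" PASSWORD="MYPASSWORD">' .
    '<Revision/><Package ID="1ST">' .
    '<Service>FIRST CLASS</Service><FirstClassMailType>PARCEL</FirstClassMailType>' .
    '<ZipOrigination>12345</ZipOrigination><ZipDestination>54321</ZipDestination>' .
    '<Pounds>0</Pounds><Ounces>9</Ounces><Container/><Size>REGULAR</Size>' .
    '</Package></RateV4Request>';

$url = 'http://production.shippingapis.com/ShippingAPI.dll?' .
       http_build_query(array('API' => 'RateV4', 'XML' => $xmlRequest));

$xml_string = file_get_contents($url);

if ($xml_string === false) {
    die('Request failed');
}

$xml = simplexml_load_string($xml_string);
var_dump($xml);
```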

    Read the article

  • Where to store web service exceptions?

    - by ICoder
    Hello all, I am working on building a web service (using C#), and this web service will use a MS SQL Server database. Now I am trying to build an exception log system for this web service. Simply put, I want to save every exception the web service throws for future use (bug tracing), so where is the best place to save these exceptions? Is it a good idea to save them in the database? What if the exception is in the connection to the database itself? I really appreciate your help and your ideas. Thanks
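    A hedged sketch of one common approach (the ErrorLog table, connection string, and file path are illustrative assumptions, not from the post): write exceptions to the database when it is reachable, and fall back to a local text file when the database connection itself is the problem.

```csharp
using System;
using System.Data.SqlClient;
using System.IO;

// Hedged sketch; the ErrorLog table, connection string, and fallback path are
// illustrative assumptions, not from the original post.
public static class ExceptionLogger
{
    private const string ConnectionString =
        @"Server=.;Database=ServiceLogs;Integrated Security=true";

    public static void Log(Exception ex)
    {
        try
        {
            WriteToDatabase(ex);   // normal path: keep exceptions queryable in SQL Server
        }
        catch
        {
            // If the database connection itself is the problem, keep a local record
            // so the original exception is not lost.
            File.AppendAllText(@"C:\Logs\service-errors.log",
                DateTime.UtcNow.ToString("o") + " " + ex + Environment.NewLine);
        }
    }

    private static void WriteToDatabase(Exception ex)
    {
        using (var connection = new SqlConnection(ConnectionString))
        using (var command = new SqlCommand(
            "INSERT INTO ErrorLog (LoggedAtUtc, Details) VALUES (@at, @details)", connection))
        {
            command.Parameters.AddWithValue("@at", DateTime.UtcNow);
            command.Parameters.AddWithValue("@details", ex.ToString());
            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}
```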

    Read the article

  • Windows Service on NetworkService account can't access remote (shared) directory

    - by Aetius
    I'm trying to remotely access a shared folder from a Windows service running under the NetworkService account. However, I get errors when I try to do this, e.g. Directory.Exists(servicePath) returns false and FileSystemWatcher doesn't recognize activity in the directory. If I change the service's account to LocalSystem, these methods work. I don't want to give the service root-level access, though. It seems to be a permissions problem, so how can I give the service permission to access the directory and monitor it?

    Read the article

  • Multiple WCF Services implementing same Service Contract interface

    - by andrewczwu
    Is it possible for multiple WCF services to implement the same service contract interface? What I want to do is allow a test service to be interchangeable with the real service, and to specify which service to use in the configuration file. For example: [ServiceContract] public interface IUselessService { [OperationContract] string GetData(int value); } Test implementation: public class TestService : IUselessService { public string GetData(int value) { return "This is a test"; } } Real class: public class RealService : IUselessService { public string GetData(int value) { return string.Format("You entered: {0}", value); } }
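    A hedged sketch of one way to make the choice configuration-driven on the client side, reusing the IUselessService contract from the post (the endpoint configuration names and the appSettings key are illustrative assumptions): declare one WCF client endpoint per implementation in the config file and create the channel from whichever endpoint name is configured.

```csharp
using System.Configuration;
using System.ServiceModel;

// Hedged sketch; "UselessServiceEndpoint" and the endpoint names are illustrative.
// app.config would contain two client <endpoint name="RealService" .../> and
// <endpoint name="TestService" .../> entries, both using contract IUselessService.
public static class UselessServiceClientFactory
{
    public static IUselessService Create()
    {
        // e.g. <add key="UselessServiceEndpoint" value="RealService" /> in appSettings
        string endpointName = ConfigurationManager.AppSettings["UselessServiceEndpoint"];

        var factory = new ChannelFactory<IUselessService>(endpointName);
        return factory.CreateChannel();
    }
}
```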

    Read the article

  • Web Service Security java

    - by WhoAmI
    My web service was created some time back using IBM JAX-RPC. As part of an enhancement, I need to add some security to the existing service. One way is to provide a handler: all requests and responses pass through the handler, and in the request I can implement authentication rules for each application/user accessing the service. Other than this, what are the possible ways of securing it? I have heard of something called WS-Security (WSSE) for web services. Is it possible to implement it for JAX-RPC, or can it be implemented only for JAX-WS? I need some helpful pointers on WS-Security so that I can jump into learning it. Other than a handler and WS-Security, is there any other possible way to make a service secure? Please help.
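    For the handler approach mentioned above, here is a hedged sketch of a JAX-RPC handler (the class name and header QName are illustrative, not from the question) that inspects the SOAP header and rejects unauthenticated requests:

```java
import javax.xml.namespace.QName;
import javax.xml.rpc.handler.GenericHandler;
import javax.xml.rpc.handler.MessageContext;
import javax.xml.rpc.handler.soap.SOAPMessageContext;
import javax.xml.soap.SOAPHeader;

// Hedged sketch of a JAX-RPC request handler; the header name and validation logic
// are placeholders for whatever credential scheme the service actually uses.
public class AuthenticationHandler extends GenericHandler {

    private static final QName AUTH_HEADER = new QName("urn:example", "Credentials");

    @Override
    public QName[] getHeaders() {
        return new QName[] { AUTH_HEADER };
    }

    @Override
    public boolean handleRequest(MessageContext context) {
        try {
            SOAPMessageContext soapContext = (SOAPMessageContext) context;
            SOAPHeader header = soapContext.getMessage().getSOAPPart()
                                           .getEnvelope().getHeader();
            // Inspect the header, validate the caller, and return false to block
            // the request if authentication fails.
            return header != null;
        } catch (Exception e) {
            return false;
        }
    }
}
```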

    Read the article

  • Windows service is not working

    - by prateeksaluja20
    I made a Windows service in Visual Studio 2008 in C#. Inside the service I wrote only a single line of code: try { System.Diagnostics.Process.Start(@"E:\Users\Sk\Desktop\category.txt"); } catch { } Then I added the project installer and set the serviceProcessInstaller1 Account property to LocalSystem, and I set the serviceInstaller1 StartType property to Automatic. Then I built the project; the build was successful. After that I added another project, a setup project; I added the primary project output and added the custom action "Primary output from DemoWindowsService (Active)", then built the setup. The setup was built successfully. I then installed the setup, went to Services, and started the service. The service started properly, but it was not performing the task. I checked that the path is correct, and I also tried System.Diagnostics.Process.Start(@"E:\Windows\system32\notepad.exe"), but the result is still the same. I have tried a lot but am not getting the answer, so please help me solve this problem.

    Read the article

  • SQL SERVER – SHRINKFILE and TRUNCATE Log File in SQL Server 2008

    - by pinaldave
    Note: Please read the complete post before taking any action. This blog post discusses SHRINKFILE and TRUNCATE Log File. The email I received from a reader contains the following questionable code:

    "Hi Pinal, If you remember, my manager and I met you at TechEd in Bangalore. We just upgraded to SQL Server 2008. One of our jobs failed because it was using the following code. The error was: Msg 155, Level 15, State 1, Line 1 'TRUNCATE_ONLY' is not a recognized BACKUP option. The code was: DBCC SHRINKFILE(TestDBLog, 1) BACKUP LOG TestDB WITH TRUNCATE_ONLY DBCC SHRINKFILE(TestDBLog, 1) GO I have modified that code to the following, and it works fine. But do you have any other suggestions at the moment? USE [master] GO ALTER DATABASE [TestDb] SET RECOVERY SIMPLE WITH NO_WAIT DBCC SHRINKFILE(TestDbLog, 1) ALTER DATABASE [TestDb] SET RECOVERY FULL WITH NO_WAIT GO The configuration of our server and system is as follows: [Removed not relevant data]"

    An email like this popping up early in the morning is alarming. Because I was extremely busy, I had only one minute to reply, so I quickly wrote down the following note. (As I said, it was a one-minute email, so it is not completely accurate.) Here is that quick email, shared with all of you.

    "Hi Mr. DBA [removed the name], Thanks for your email. I suggest you stop this practice. There are many issues involved here, but I would list two major ones: 1) By setting the database to simple recovery, shrinking the file, and then setting it back to full recovery, you are in fact losing your valuable log data and will not be able to restore to a point in time. Not only that, you will also not be able to use subsequent log backups. 2) Shrinking a file or database adds fragmentation. There are a lot of things you can do. First, start taking proper log backups using the following command instead of truncating the log and losing it frequently: BACKUP LOG [TestDb] TO DISK = N'C:\Backup\TestDb.bak' GO Remove the code that shrinks the file. If you are taking proper log backups, your log file usually (again, usually; special cases are excluded) does not grow very big. There are so many things to add here, but you can call me on my [phone number]. Before you call me, for accuracy I suggest you read Paul Randal's two posts here and here and Brent Ozar's post here. Kind Regards, Pinal Dave"

    I guess this post is very clear to you. Please leave your comments here. As mentioned, this is a very big subject; I have just touched the tip of the iceberg and have tried to point you to authentic knowledge. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Backup and Restore, SQL Data Storage, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Sun Fire X4270 M3 SAP Enhancement Package 4 for SAP ERP 6.0 (Unicode) Two-Tier Standard Sales and Distribution (SD) Benchmark

    - by Brian
    Oracle's Sun Fire X4270 M3 server achieved 8,320 SAP SD Benchmark users running SAP enhancement package 4 for SAP ERP 6.0 with unicode software using Oracle Database 11g and Oracle Solaris 10. The Sun Fire X4270 M3 server using Oracle Database 11g and Oracle Solaris 10 beat both the IBM Flex System x240 and the IBM System x3650 M4 servers running DB2 9.7 and Windows Server 2008 R2 Enterprise Edition. The Sun Fire X4270 M3 server running Oracle Database 11g and Oracle Solaris 10 beat the HP ProLiant BL460c Gen8 server using SQL Server 2008 and Windows Server 2008 R2 Enterprise Edition by 6%. The Sun Fire X4270 M3 server using Oracle Database 11g and Oracle Solaris 10 beat the Cisco UCS C240 M3 server running SQL Server 2008 and Windows Server 2008 R2 Datacenter Edition by 9%. The Sun Fire X4270 M3 server running Oracle Database 11g and Oracle Solaris 10 beat the Fujitsu PRIMERGY RX300 S7 server using SQL Server 2008 and Windows Server 2008 R2 Enterprise Edition by 10%.

    Performance Landscape

    SAP-SD two-tier performance table (in decreasing performance order), SAP ERP 6.0 Enhancement Package 4 (Unicode) results (benchmark version from January 2009 to April 2012). Columns: System | OS | Database | Users | SAP ERP/ECC Release | SAPS | SAPS/Proc | Date.

    Sun Fire X4270 M3, 2x Intel Xeon E5-2690 @2.90GHz, 128 GB | Oracle Solaris 10 | Oracle Database 11g | 8,320 | 2009 6.0 EP4 (Unicode) | 45,570 | 22,785 | 10-Apr-12
    IBM Flex System x240, 2x Intel Xeon E5-2690 @2.90GHz, 128 GB | Windows Server 2008 R2 EE | DB2 9.7 | 7,960 | 2009 6.0 EP4 (Unicode) | 43,520 | 21,760 | 11-Apr-12
    HP ProLiant BL460c Gen8, 2x Intel Xeon E5-2690 @2.90GHz, 128 GB | Windows Server 2008 R2 EE | SQL Server 2008 | 7,865 | 2009 6.0 EP4 (Unicode) | 42,920 | 21,460 | 29-Mar-12
    IBM System x3650 M4, 2x Intel Xeon E5-2690 @2.90GHz, 128 GB | Windows Server 2008 R2 EE | DB2 9.7 | 7,855 | 2009 6.0 EP4 (Unicode) | 42,880 | 21,440 | 06-Mar-12
    Cisco UCS C240 M3, 2x Intel Xeon E5-2690 @2.90GHz, 128 GB | Windows Server 2008 R2 DE | SQL Server 2008 | 7,635 | 2009 6.0 EP4 (Unicode) | 41,800 | 20,900 | 06-Mar-12
    Fujitsu PRIMERGY RX300 S7, 2x Intel Xeon E5-2690 @2.90GHz, 128 GB | Windows Server 2008 R2 EE | SQL Server 2008 | 7,570 | 2009 6.0 EP4 (Unicode) | 41,320 | 20,660 | 06-Mar-12

    Complete benchmark results may be found at the SAP benchmark website http://www.sap.com/benchmark.

    Configuration and Results Summary

    Hardware configuration: Sun Fire X4270 M3; 2 x 2.90 GHz Intel Xeon E5-2690 processors; 128 GB memory; Sun StorageTek 6540 with 4 * 16 * 300GB 15Krpm 4Gb FC-AL.

    Software configuration: Oracle Solaris 10; Oracle Database 11g; SAP enhancement package 4 for SAP ERP 6.0 (Unicode).

    Certified results (published by SAP): Number of benchmark users: 8,320. Average dialog response time: 0.95 seconds. Throughput: fully processed order line: 911,330; dialog steps/hour: 2,734,000; SAPS: 45,570. SAP Certification: 2012014.

    Benchmark Description

    The SAP Standard Application SD (Sales and Distribution) Benchmark is a two-tier ERP business test that is indicative of full business workloads of complete order processing and invoice processing, and demonstrates the ability to run both the application and database software on a single system. The SAP Standard Application SD Benchmark represents the critical tasks performed in real-world ERP business environments. SAP is one of the premier world-wide ERP application providers, and maintains a suite of benchmark tests to demonstrate the performance of competitive systems on the various SAP products.
    See Also: SAP Benchmark Website; Sun Fire X4270 M3 Server (oracle.com, OTN); Oracle Solaris (oracle.com, OTN); Oracle Database 11g Release 2 Enterprise Edition (oracle.com, OTN).

    Disclosure Statement

    Two-tier SAP Sales and Distribution (SD) standard SAP SD benchmark based on SAP enhancement package 4 for SAP ERP 6.0 (Unicode) application benchmark as of 04/11/12: Sun Fire X4270 M3 (2 processors, 16 cores, 32 threads) 8,320 SAP SD Users, 2 x 2.90 GHz Intel Xeon E5-2690, 128 GB memory, Oracle 11g, Solaris 10, Cert# 2012014. IBM Flex System x240 (2 processors, 16 cores, 32 threads) 7,960 SAP SD Users, 2 x 2.90 GHz Intel Xeon E5-2690, 128 GB memory, DB2 9.7, Windows Server 2008 R2 EE, Cert# 2012016. IBM System x3650 M4 (2 processors, 16 cores, 32 threads) 7,855 SAP SD Users, 2 x 2.90 GHz Intel Xeon E5-2690, 128 GB memory, DB2 9.7, Windows Server 2008 R2 EE, Cert# 2012010. Cisco UCS C240 M3 (2 processors, 16 cores, 32 threads) 7,635 SAP SD Users, 2 x 2.90 GHz Intel Xeon E5-2690, 128 GB memory, SQL Server 2008, Windows Server 2008 R2 DE, Cert# 2012011. Fujitsu PRIMERGY RX300 S7 (2 processors, 16 cores, 32 threads) 7,570 SAP SD Users, 2 x 2.90 GHz Intel Xeon E5-2690, 128 GB memory, SQL Server 2008, Windows Server 2008 R2 EE, Cert# 2012008. HP ProLiant DL380p Gen8 (2 processors, 16 cores, 32 threads) 7,865 SAP SD Users, 2 x 2.90 GHz Intel Xeon E5-2690, 128 GB memory, SQL Server 2008, Windows Server 2008 R2 EE, Cert# 2012012. SAP, R/3, reg TM of SAP AG in Germany and other countries. More info www.sap.com/benchmark

    Read the article

  • Want a headless build server for SSDT without installing Visual Studio? You’re out of luck!

    - by jamiet
    An issue that regularly seems to rear its head on my travels is that of headless build servers for SSDT. What does that mean exactly? Let me give you my interpretation of it. A SQL Server Data Tools (SSDT) project incorporates a build process that will basically parse all of the files within the project and spit out a .dacpac file. Where an organisation employs a Continuous Integration process, it will likely want to automate the building of that dacpac whenever someone commits a change to the source control repository. In order to do that, the organisation will use a build server (e.g. TFS, TeamCity, Jenkins), and hence that build server requires all the pre-requisite software that understands how to build an SSDT project. The simplest way to install all of those pre-requisites is to install SSDT itself; however, a lot of folks don't like that approach because it installs a lot of unnecessary components, not least Visual Studio itself. Those folks (of which I am one) are of the opinion that it should be unnecessary to install a heavyweight GUI in order to simply get a few software components required to do something that inherently doesn't even need a GUI. The phrase "headless build server" is often used to describe a build server that doesn't contain any heavyweight GUI tools such as Visual Studio, and it is a desirable state for a build server. In his blog post Headless MSBuild Support for SSDT (*.sqlproj) Projects, Gert Drapers outlines the steps necessary to obtain a headless build server for SSDT: This article describes how to install the required components to build and publish SQL Server Data Tools projects (*.sqlproj) using MSBuild without installing the full SQL Server Data Tools hosted inside the Visual Studio IDE. http://sqlproj.com/index.php/2012/03/headless-msbuild-support-for-ssdt-sqlproj-projects/ Frankly, however, going through these steps is a royal PITA, and folks like myself have longed for Microsoft to support headless builds for SSDT by providing a distributable installer that installs only the pre-requisites for building SSDT projects. Yesterday, in the MSDN forum thread Building a VS2013 headless build server - it's sooo hard, Mike Hingley complained about this very thing, and it prompted a response from Kevin Cunnane from the SSDT product team: The official recommendation from the TFS / Visual Studio team is to install the version of Visual Studio you use on the build machine. I, like many others, would rather not have to install full-blown Visual Studio, and so I asked: Is there any chance you'll ever support any of these scenarios: Installation of all build/deploy pre-requisites without installing the VS shell? TFS shipping with all of the pre-requisites for doing SSDT project build/deploys? 3rd party build servers (e.g. TeamCity) shipping with all of the pre-requisites for doing SSDT project build/deploys? I have to say that the lack of a single installer containing all the pre-requisites for SSDT build/deploy puzzles me. Surely the DacFX installer would be a perfect vehicle for that? Kevin replied again: The answer is no for all 3 scenarios. We looked into this issue, discussed it with the Visual Studio / TFS team, and in the end agreed to go with their latest guidance which is to install Visual Studio (e.g. VS2013 Express for Web) on the build machine. This is how Visual Studio Online is doing it and it's the approach recommended for customers setting up their own TFS build servers. 
    I would hope this is compatible with 3rd party build servers but have not verified whether this works with TeamCity etc. Note that DacFx MSI isn't a suitable release vehicle for this as we don't want to include Visual Studio/MSBuild dependencies in that package. It's meant to just include the core DacFx DLLs used by SSMS, SqlPackage.exe on the command line, etc. What this means is we won't be providing a separate MSI installer or nuget package with just the necessary build DLLs you need to run your build and tests. If someone wanted to create a script that generated a nuget package based on our DLLs and targets files, then release that somewhere on the web for easier integration with 3rd party build servers we've no problem with that. Again, here's the link to the thread, and it's worth reading in its entirety if this is something that interests you. So there you have it. Microsoft will not be providing support for headless build servers for SSDT, but if someone in the community wants to go ahead and roll their own, go right ahead. @Jamiet
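    For context, the headless build that Gert's article describes ultimately boils down to invoking MSBuild directly against the project file once the prerequisites are in place; a sketch, with an illustrative project name:

```
msbuild MyDatabase.sqlproj /t:Build /p:Configuration=Release
```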

    Read the article

  • SQL – Quick Start with Admin Sections of NuoDB – Manage NuoDB Database

    - by Pinal Dave
    In the yesterday’s blog post we have seen that it is extremely easy to install the NuoDB database on your local machine. Now that the application is properly set up, let us explore NuoDB a bit more and get you familiar with the how it works and what the important areas of the NuoDB are that you should learn. As we have already installed NuoDB, now we will quickly start with two of the important areas in NuoDB: 1) Admin and 2) Explorer. In this blog post I will explore how the Admin Section of the NuoDB Console works.  In the next blog post we will learn how the Explorer Section works. Let us go to the NuoDB Console by typing the following URL in your browser: http://localhost:8080/ It will bring you to the following screen: On this screen you can see a big Start QuickStart button. Click on the button and it will bring you to following screen. On this screen you will find very important information about Domain and Database Settings. It is our habit that we do not read what is written on the screen and keep on clicking on continue without reading. While we are familiar with most wizards, we can often miss the very important message on the screen. Please note the information of Domain Settings and Database Settings from the following screen before clicking on Create Database. Domain Settings User: quickstart Password: quickstart Database Settings User: dba Password: goalie Database: test Schema: HOCKEY Once you click on the Create Database button it will immediately start creating sample database. First, it will start a Storage Manager and right after that it will start a Transaction Engine. Once the engine is up, it will Create a Schema and Sample Data. On the success of the creating the sample database it will show the following screen. Now is the time where we can explore the NuoDB Admin or NuoDB Explorer. If you click on Admin, it will first show following login screen. Enter for the username “domain” and for the password “bird”. Alternatively you can enter “quickstart”  twice for username and password.  It works as too. Once you enter into the Admin Section, on the left side you can see information about NuoDB and Admin Console and on the right side you can see the domain overview area. From this Administrative section you can do any of the following tasks: Create a view of the entire domain Add and remove databases Start and stop NuoDB Transaction Engines and Storage Managers Monitor transaction across all the NuoDB databases On the right side of the Admin Section we can see various information about a particular NuoDB domain. You can quickly view various alerts, find out information about the number of host machines that are provisioned for the domain, and see the number of databases and processes that are running in the domain. If you click on the “1 host” link you will be able to see various processes, CPU usage and other information. In the Processes Section you can see that there are two different types of processes. The first process (where you can see the floppy drive icon) represents a running Storage Manager process and the second process a running Transaction Engine process. You can click on the links for the Storage Manager and Transaction Engine to see further statistical details right down to the last byte of the data. There are various charts available for analysis as well. I think the product is quite mature and the user can add different monitor charts to the Admin section. Additionally, the Admin section is the place where you can create and manage new databases. 
I hope today’s tutorial gives you enough confidence that you can try out NuoDB and checkout various administrative activities with the database. I am personally impressed with their dashboard related to various counters. For more information about how the NuoDB architecture works and what a Storage Manager or Transaction Engine does, check out this short video with NuoDB CTO Seth Proctor:  In the next blog post, we will try out the Explorer section of NuoDB, which allows us to run SQL queries and write SQL code.  Meanwhile, I strongly suggest you download and install NuoDB and get yourself familiar with the product. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: NuoDB

    Read the article

  • Video: Telerik Silverlight Chart showing live data from SharePoint 2010

    - by Sahil Malik
    In this video, I demonstrate: the process of writing, authoring, deploying, configuring, and debugging a custom WCF service in SharePoint 2010; integrating it with the Telerik Silverlight RAD Chart to show live data from the server (CPU usage of your web front end, though it can be enhanced to show whatever else you want); and doing all of this in Visual Studio 2010: how you'd put your project together, and how you go about diagnosing and debugging it, the whole bit. The whole presentation is about 45 minutes, and it's mostly all code, so there's plenty of juicy stuff here! At the end of this, you have a pretty sexy app running... just fast forward to the end of the video below, and you'll see what I'm talking about. :) You can watch the video here.

    Read the article

  • OTP Bank uses Oracle Warehouse Builder

    - by Fekete Zoltán
    The following document has just been published among the customer success stories on Oracle.com: OTP Bank Data Warehouse Development Team Improves Service Level and Lowers Reporting Lead Time for Business Fields by 80%. In other words, OTP Bank uses the Oracle Warehouse Builder ETL/ELT tool for its data warehouse development. OTP Bank's Transactional Data Warehouse development team has raised the quality of the services it provides to internal customers; among the results, it has cut the lead time of cross-business-line reporting processes by 80%. The Hungarian-language success story can be downloaded here. The most important results related to OWB: improved data quality achieved through the standardization of ETL processes; with OWB and Oracle Business Intelligence EE, more effective cooperation between the business areas and IT development; standardized ETL and reporting processes; fixed report sets leading to deliberate business metadata management and a unified terminology; precise knowledge of complex banking processes for both the business areas and the IT developers; effective cooperation across the bank; a shorter turnaround from request to data publication; and ad-hoc report preparation reduced by 80%, from the previous 1.5 weeks to an average of 2 working days.

    Read the article

  • SQL SERVER – Identifying Column Data Type of uniqueidentifier without Querying System Tables

    - by pinaldave
    I love interesting conversations related to SQL Server. One of my friends, Madhivanan, always comes up with an interesting point of conversation. Here is one of the conversations between us. I am very confident this blog post will give you some new knowledge. Madhi: How do I know if any table has a uniqueidentifier column used in it? Pinal: I am sure you know that you can do it through some DMVs or catalog views. Madhi: I know that, but how can we do that without using DMVs or catalog views? Pinal: Hm… what can I use? Madhi: You can use the table name. Pinal: Easy, just say SELECT YourUniqueIdentCol FROM Table. Madhi: Hold on, the question seems to be not clear to you – you do not know the name of the column. As a matter of fact, you do not know whether the table has a uniqueidentifier column at all. The only information you have is the table name. Pinal: Madhi, this seems like you are changing the question when I am close to the answer. Madhi: Well, are you clear now? Let me say it again – how do I know if any table has a uniqueidentifier column, and what is its value, without using any DMVs or system catalogs? The only information you know is the table name, and you are allowed to return any kind of error if the table does not have a uniqueidentifier column. Pinal: Do you know the answer? Madhi: Yes. I just wanted to test your knowledge about SQL. Pinal: I will have to think. Let me accept that I do not know it right away. Can you share the answer please? Madhi: I won! Here it goes! Pinal: When I have friends like you – who needs enemies? Madhi: (laughter which did not stop for a minute). CREATE TABLE t ( GuidCol UNIQUEIDENTIFIER DEFAULT newsequentialid() ROWGUIDCOL, data VARCHAR(60) ) INSERT INTO t (data) SELECT 'test' INSERT INTO t (data) SELECT 'test1' SELECT $rowguid FROM t DROP TABLE t This is indeed very interesting to me. Please note that this is not the optimal way, and there will be many other ways to retrieve the uniqueidentifier column's name and value. What I learned from this was that if I am in a rush to check whether a table has a uniqueidentifier column and I do not know its name, I can use SELECT TOP (1) $rowguid and quickly learn the name of the column. I can later use the same column name in my query. Madhi did teach me this new trick. Did you know this? What are other ways to check for the existence of a uniqueidentifier column in a database? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology
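    For comparison, here is a hedged sketch of the conventional check that the puzzle deliberately ruled out, using the catalog views (the table name dbo.t matches the example above):

```sql
-- A hedged sketch, not from the original post: the catalog-view approach the puzzle avoided.
SELECT c.name AS ColumnName
FROM sys.columns AS c
JOIN sys.types AS t ON t.user_type_id = c.user_type_id
WHERE c.[object_id] = OBJECT_ID('dbo.t')
  AND t.name = 'uniqueidentifier';
```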

    Read the article

  • Le service de "Cloud computing" de jeux vidéo OnLive annonce ses dates et son prix à la Game Develop

    Update of 11/03/10. The OnLive video game "cloud computing" service announces its dates and price at the Game Developers Conference 2010. Taking place right now, the Game Developers Conference 2010 gave Mike McGarvey, head of the OnLive project, the opportunity to confirm several points about his cloud gaming project for video games. We learn that the service will be available from June 17 on American soil; nothing has been said about the service's availability in Europe. The service will require customers to take out a monthly subscription priced at $14.95 (roughly €11 per month). The service will be in a...

    Read the article

  • Where do service implementations fit into the Microsoft Application Architecture guidelines?

    - by tuespetre
    The guidelines discuss the service layer with its service interfaces and data/message/fault contracts. They also discuss the business layer with its logic/workflow components and entities, as well as the 'optional' application facade. What is still unclear to me after studying this guide is where the implementations of the service interfaces belong. Does the application facade in the business layer implement these interfaces, or does a separate 'service facade' exist to make calls to the business layer and its facade/raw components? (With the former, there would be fewer seemingly trivial calls to yet another layer, though with the latter I could see how the service layer could remove the concern of translating business entities to data contracts from the business layer.)
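    To make the two readings concrete, here is a hedged sketch (all type names are illustrative, not from the guidance) of the second interpretation: a separate, thin service facade that implements the service contract and delegates to the business layer's application facade, keeping the entity-to-data-contract translation out of the business layer.

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

// Hedged sketch; IOrderService, OrderDto, Order and OrderFacade are illustrative names.
[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    OrderDto GetOrder(int orderId);
}

[DataContract]
public class OrderDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public decimal Total { get; set; }
}

// Business layer: entity plus application facade.
public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public class OrderFacade
{
    public Order GetOrder(int orderId)
    {
        return new Order { Id = orderId, Total = 0m };   // would call workflow/logic components
    }
}

// Service layer: implements the contract and translates entity -> data contract at the boundary.
public class OrderService : IOrderService
{
    private readonly OrderFacade _facade = new OrderFacade();

    public OrderDto GetOrder(int orderId)
    {
        Order entity = _facade.GetOrder(orderId);
        return new OrderDto { Id = entity.Id, Total = entity.Total };
    }
}
```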

    Read the article

  • In-memory datastore in Haskell

    - by Simon
    I want to implement an in-memory datastore for a web service in Haskell, and I want to run transactions in the STM monad. When I google "hash table stm haskell" I only get this: Data.BTree.HashTable.STM. The module name and the stated complexities suggest that it is implemented as a tree, whereas I would think an array would be more efficient for mutable hash tables. Is there a reason to avoid using an array for an STM hash table? Do I gain anything with this STM hash table, or should I just use an STM ref to an IntMap?
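    A hedged sketch (not from the question) of the simpler alternative mentioned at the end, a single TVar holding an IntMap as the whole datastore, with inserts and lookups run atomically in STM:

```haskell
-- Hedged sketch: a TVar-wrapped IntMap as the in-memory datastore.
import Control.Concurrent.STM (TVar, atomically, newTVarIO, readTVar, modifyTVar')
import qualified Data.IntMap.Strict as IntMap
import Data.IntMap.Strict (IntMap)

type Store v = TVar (IntMap v)

newStore :: IO (Store v)
newStore = newTVarIO IntMap.empty

insert :: Int -> v -> Store v -> IO ()
insert k v store = atomically $ modifyTVar' store (IntMap.insert k v)

lookupKey :: Int -> Store v -> IO (Maybe v)
lookupKey k store = atomically $ IntMap.lookup k <$> readTVar store

main :: IO ()
main = do
  store <- newStore
  insert 1 "hello" store
  insert 2 "world" store
  v <- lookupKey 1 store
  print (v :: Maybe String)
```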

    Read the article
