Search Results

Search found 14056 results on 563 pages for 'odata services'.


  • Remote connection to a Windows 2008 Server Web edition

    - by Lorenzo
    Hello, I have just installed Windows Server 2008 Web Edition to host a development/test site in my office. The test network has only 2 machines: a Windows Server 2008 Web Edition server, and a Vista x64 client machine with Visual Studio. The client and the server are networked through a NETGEAR router. I have enabled Remote Desktop on the server, and when I try to connect to it from the Vista client I get the credential window as in the following screenshot. But even when I enter the correct credentials, I am not able to log in to the server remotely. What am I doing wrong? Update 1: I have also tried to create a folder share on the server, but I am not able to access it for the same reason. It says the user name or password is invalid, which seems impossible, as I log in to the server locally with the same credentials. Update 2: If I try to browse the network from the RDP client, I receive a message saying that there are no servers running Terminal Services in my network... :O


  • Software to monitor bill payment to mission critical IT service providers (ISP, DNS etc.)

    - by Sholom
    Hi All, The problem: our very likable but absent-minded bookkeeper keeps neglecting to pay our IT vendors on time. Just this past week our internet service was disconnected. The same could happen to many other mission-critical accounts (domain registrar, backup MX, anti-virus license, HackerSafe (McAfee Secure) service, and even an 800 number, to name a few). As the sysadmin, I monitor my servers to make sure they are plugged into the power outlet. I believe I should also monitor my services to make sure they are plugged into their money outlet. To compound the problem, when the power goes out, someone else will likely notice and notify me. But if a bill is not paid, no one will notice until service is lost. Lost as in losing our domain name, which would cause a lot more damage than the power failing on our server. [Solution] = [Doesn't work because]: Retrain the bookkeeper = wishful thinking. Notify my manager = already have (via email); protects me, does not solve the problem. Fire the bookkeeper = what makes you so sure the next one will never forget? Bottom line: humans are humans, and sooner or later something critical will be royally messed up. We need to partner with a machine to help us out here. Anybody have the same problem? What software/solution do you use? I would like software that emails me when a bill is past due, just like I get an email when a power outlet fails. Anyone hear of anything like that? Thanks
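    A minimal sketch of that idea (the file name, columns, and addresses below are all assumptions): a daily cron job that reads vendor due dates from a CSV and emails an alert when a bill is due within a few days or past due.

        # bill_monitor.py - run daily from cron; file layout and addresses are placeholders
        import csv
        import smtplib
        from datetime import date, datetime, timedelta
        from email.mime.text import MIMEText

        ALERT_WINDOW = timedelta(days=5)
        alerts = []

        # vendor_bills.csv columns: vendor,account,due_date (YYYY-MM-DD)
        with open("vendor_bills.csv") as f:
            for row in csv.DictReader(f):
                due = datetime.strptime(row["due_date"], "%Y-%m-%d").date()
                if due - date.today() <= ALERT_WINDOW:
                    alerts.append("%s (acct %s) due %s" % (row["vendor"], row["account"], due))

        if alerts:
            msg = MIMEText("\n".join(alerts))
            msg["Subject"] = "Vendor bills due soon"
            msg["From"] = "billing-monitor@example.com"
            msg["To"] = "sysadmin@example.com"
            s = smtplib.SMTP("localhost")
            s.sendmail(msg["From"], [msg["To"]], msg.as_string())
            s.quit()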


  • AWS own email domain and some generic questions

    - by John Brunner
    I'm getting started with Amazon Web Services and I have a few questions I'm not sure about. Like every company webpage, I want to use an "office@example.com" email address, but how is that done? I looked at godaddy.com (for domain registration); they offer me an email address like I want, but for 3 dollars per month. Is this possible with AWS? Because at AWS you just get a complex domain name, which is not very user-friendly or professional. I also want to host my dynamic webpage in the Amazon cloud, but I'm not sure if I'm doing that right. I've read many guides, and all I know is that I have to use Elastic Compute Cloud and Simple Storage Service... and every guide works with the basic Linux package. Why not Windows? Is it more expensive? I just want to host a MySQL server for the dynamic webpage, which is reached over a normal domain. And one last question: if I sign up for an AWS account, it asks me for an email account, but I find it a little unprofessional to enter my free-webmailer address there... How is this normally done? Thanks in advance! Best regards, john.


  • Map a URL bought with Dreamhost to Amazon EC2 (AWS)

    - by Edan Maor
    I have several URLs I purchased through Dreamhost. I'm starting to use Amazon's AWS, and I'd like to map the URLs to Amazon. This is something of a silly question, and I've already done the same thing several times for other services (mapping from Dreamhost to WebFaction). But when I tried to find the proper way to do the same mapping to Amazon, I found a lot of detailed writing about whether I should be using CNAME or A records, etc. So I wanted to ask in the simplest possible terms and hopefully get a simple, concrete answer: I bought a URL from Dreamhost, and I have an EC2 server running on AWS (to which I have already mapped an Elastic IP address). How do I make the URL map to AWS? And if there are several options, which one should I effectively be using? P.S. Meta-question: why are things so much more difficult with AWS? When I search Google for "Move from Dreamhost to WebFaction", I get very simple answers on how to do the mapping. In what way is AWS different?
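    For what it's worth, the usual pattern is an A record pointing the domain at the Elastic IP (CNAME records are not allowed at a zone apex, though they work for subdomains). A quick way to verify the record once it propagates, assuming the dnspython package; the domain and IP below are placeholders:

        import dns.resolver

        ELASTIC_IP = "203.0.113.10"  # hypothetical Elastic IP
        for rdata in dns.resolver.resolve("www.example.com", "A"):
            # Prints each A record and whether it matches the Elastic IP.
            print(rdata.address, rdata.address == ELASTIC_IP)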


  • Migrating to AWS Cloud with auto-scaling - where to put Redis and ElasticSearch?

    - by RobMasters
    I've been trying to research this topic but haven't found anywhere that recommends where to install services such as Redis and ElasticSearch when migrating to a cloud framework. I'm currently running a Symfony2 application on 2 static servers - one is running MySQL and the other is the public-facing web server, which also has Redis and ElasticSearch running on it. Both of these servers are virtualised, but they're static in the sense that they can't be replicated at present (various aspects are still dependent on the local filesystem). The goal is to migrate to AWS and use auto-scaling to be able to spin up and kill web servers as required, but I'm not clear on what I should put on each EC2 instance. Should they be single-responsibility only? i.e. set up individual instances for the web server(s), Redis, and ElasticSearch, most likely an RDS instance for MySQL, and only set up auto-scaling on the web server(s)? I don't foresee having to scale the ElasticSearch server anytime soon, as it's only driving the search functionality, but it's possible that Redis may need to be replicated at some point - should this be done manually? I'm not sure how it could be done automatically, as each instance needs to be configured to know about its master/slave(s) as far as I know. I'd appreciate advice on this. One more quick question while I'm here - how would I be able to deploy code changes when there are X web servers currently active? I'm using a Capifony deployment script (the Symfony2 version of Capistrano), which I think can handle multiple servers easily enough by specifying an array of :domain addresses... but how should this be handled when the number of web servers can vary?


  • Suddenly getting lock timeouts with MySQL

    - by Marc Hughes
    We've got a web app hosted on Amazon Web Services. Our database is a Multi-AZ RDS MySQL server running 5.1.57, and 3-4 app servers talk to it. Today we started seeing a lot of errors along the lines of "Lock wait timeout exceeded; try restarting transaction" - almost 1% of POST requests are seeing this. There have been no modifications to the code running on the site. There have been no schema changes. We haven't had a big spike in traffic. I've been looking at the processes running, and none seem out of control. I tried scaling our RDS instance from a small to a large, with no effect. Two days ago, Amazon had some outages. As part of the recovery from that, our RDS server and our app servers ended up in different availability zones, but all within the same region. But yesterday everything was fine, so I'm not convinced that's related. The lock timeouts occur in different types of requests and in different InnoDB tables. I have noticed that the number of open connections jumped when we started seeing problems, but that may be a symptom and not a cause. What are my next steps in debugging this?
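    One place to start is to look at what InnoDB itself reports about lock waits while the errors are happening. A minimal sketch, assuming the pymysql package; the host and credentials are placeholders:

        import pymysql

        conn = pymysql.connect(host="mydb.example.rds.amazonaws.com",
                               user="admin", password="secret")
        try:
            with conn.cursor() as cur:
                # Long-running or blocked statements show up here.
                cur.execute("SHOW FULL PROCESSLIST")
                for row in cur.fetchall():
                    print(row)
                # The TRANSACTIONS section names the statements and row
                # locks involved in current lock waits and deadlocks.
                cur.execute("SHOW ENGINE INNODB STATUS")
                print(cur.fetchone()[2])
        finally:
            conn.close()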


  • Performance data collection for short-running, ephemeral servers

    - by ErikA
    We're building a medical image processing software stack, currently hosted on various AWS resources. As part of this application, we have a handful of long-running servers (database, load balancers, web application, etc.). Collecting performance data on those servers is quite simple - my go-to recipe of Nagios (for monitoring/notifications) and Munin (for collection of performance data and displaying trends) will work just fine. However, as part of this application, we are constantly starting up and terminating compute instances on EC2. In typical usage, these compute instances start up, configure themselves, receive a job from a message queue, and then get to work processing that job, which takes anywhere from 15 minutes to over 8 hours. After job completion, these instances get terminated, never to be heard from again. What is a decent strategy for collecting performance data on these short-lived instances? I don't necessarily need monitoring on them - if they fail for whatever reason, our application will detect this and handle re-starting the job on another instance or raising a flag so an administrator can take a look. However, it would still be useful to collect information like CPU (user, idle, iowait, etc.), memory usage, network traffic, disk read/write data, etc. In our internal database, we track the instance ID of the machine that runs each job, and it would be quite helpful to be able to look up performance data for a specific instance ID for troubleshooting and profiling. Munin doesn't seem like a great candidate, as it requires maintaining a list of Munin nodes in a text file - far from ideal for an environment with a high amount of churn - and for the short time each node will be running, I'd rather keep the full-resolution data indefinitely than have RRD water down the data over time. In the end, my guess is that this will require a monitoring engine that uses a database (MySQL, SQLite, etc.) for configuration and data storage, and that exposes an API for adding/removing hosts and services. Are there other things I should be thinking about when evaluating options? Perhaps I'm over-thinking this, though, and just ought to run sar at 1-minute intervals on these short-lived instances and collect the sar data files prior to termination.
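    The sar idea can be made concrete with a small shutdown hook that exports the day's sar data and archives it to S3 keyed by instance ID, so it can be joined against the job database later. A sketch, assuming boto3 and a hypothetical bucket name:

        import subprocess
        import urllib.request
        import boto3

        # The EC2 instance metadata service provides the instance ID.
        instance_id = urllib.request.urlopen(
            "http://169.254.169.254/latest/meta-data/instance-id", timeout=2
        ).read().decode()

        # Export today's sar records in a parseable, full-resolution format.
        with open("/tmp/sar.csv", "wb") as out:
            subprocess.run(["sadf", "-d"], stdout=out, check=True)

        # Archive under the instance ID for later lookup.
        boto3.client("s3").upload_file(
            "/tmp/sar.csv", "perf-archive-bucket", "sar/%s.csv" % instance_id
        )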


  • What is the reason behind a Windows service stopping - is it due to LAN problems or other issues?

    - by Steve
    I have a Windows service named Trunk which stopped one day, and I just want to know the reason behind it. This is the entry in the logs:

    Nov 15 17:54:04.318 :Trunk-1516:Trunk:handle_control_event:Received CTRL_LOGOFF_EVENT, ignore it
    Nov 25 15:54:52.157 :Trunk-1516:Trunk:ERROR - Process Restart Count (5) Exceeded for:C:\Program Files\secon\11.1.4\bin\vmd
    Nov 25 15:54:52.157 :Trunk-1516:Trunk:Stopping Trunk ...
    Nov 25 15:54:52.314 :Trunk-1516:Trunk:Shutting down, signaled C:\Program F
    Nov 20 15:54:20.345 :SCBridge.RegisterBridge:Exception in method: ScUtility.ScCommandException (0xa08990002): Exception from HRESULT: 0xa08990002 Supplemental Information: None available.
    at ScServer.ScServiceProcessorRegistryManager.Attach(String serviceProcessor, ScClientInformation clientInfo, FORCE_ATTACH_SPEC forceAttachToMaster)
    at ScServer.ScServiceProcessorRegistry.Attach(String serviceProcessor, Object clientInfo)
    at ScServer.ScServiceProcessorRegistry.Attach(String serviceProcessor)
    at ServerControlInterface.SCBridge.RegisterBridge(String SPName)
    for system APOLLOSP0 attempting to attach and register with the Bridge

    The service is registered with a specific account, so I initially thought that the user logging off from the machine, or some LAN disconnection problem, might be the reason behind it. But having taken another look at the entry above, we seem to have a constant failure being generated in vmd, which causes Trunk to detect that vmd requires a restart. Most of the time it works OK and the restart count is anything up to 4. In this case the Trunk log confirms that the restart count is 5 and so is considered exceeded. Presumably this triggers the termination of the other services, and Trunk is actually doing its job. So, could this just be a timing issue, meaning we need to increase the tolerance level (i.e. the restart count), or do we need to address the 0xa08990002 error in vmd?


  • Windows service fails to start with local user until password is entered again in logon tab

    - by Nick
    Basically, we have a service that uses a local account as its logon. It has all the proper permissions, and everything is working fine; the service starts and runs and all is good. Then one day, after rebooting, the service fails to start. Logs show an incorrect password. Our technicians resolve the issue by simply retyping the password into the "Log On" tab in services.msc. Unfortunately we have not been able to root-cause this. I suspect that the password stored for the service is somehow being lost. Does anyone know where the password hash might be stored, so we can check it? The only activities that seem possibly related are Microsoft security patches, but we have multiple servers running the same service, we have never seen more than one affected at a time, and it's usually a different one each time this occurs. I believe this to be the same issue as this: Windows service fails to start with custom user until started once with local user. But I was unable to add comments, and it's really old.


  • Get the Windows Scheduler's Location Or Code?

    - by Ram
    Today I was given a task to find a scheduled job that runs once a month; I have to change its schedule to once a week. I searched for it in the Windows scheduled tasks, but I didn't find it. It sends a mail containing a link. Now I am confused about where else I can find it, as the place I know the scheduler could be has already been checked by me. Can someone suggest where else I can search for this scheduler? As per the previous developer's comment, "It is a normal Windows method of scheduling tasks". UPDATE: The task runs monthly. As far as I know, it could be a Windows scheduled task or a Windows service created by the old developer. The previous developer is not available and I do not have any documentation, and I need to change the period from monthly to weekly. I have checked the Task Scheduler on the server and the services running on the server, and was unable to find the scheduler. Now I have two questions: 1. Is there any other approach I can use to schedule an automated email periodically? 2. Any idea how to find this one?
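    If it is a scheduled task, one way to hunt for it is to dump every task verbosely and search for a distinctive keyword (the mail subject, a script name, etc.). A sketch, assuming Python is available on the server; the search term is a placeholder:

        import subprocess

        result = subprocess.run(
            ["schtasks", "/query", "/fo", "LIST", "/v"],
            capture_output=True, text=True, check=True
        )
        # schtasks prints each task as a blank-line-separated block;
        # keep only the blocks that mention the keyword.
        for block in result.stdout.split("\n\n"):
            if "mail" in block.lower():  # placeholder keyword
                print(block)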


  • What is the correct approach I should use for an application that requires Amazon S3 uploads and SimpleDB data management?

    - by Luis Oscar
    I am developing an application for iOS, and that is going smoothly; the problem is that I am very new at server-side things. I am totally confused about how to correctly use Amazon Web Services for this purpose. What I want to do is very simple: I want my application to be able to query a servlet hosted in EC2 to retrieve pictures and data, based on some criteria, from S3 and SimpleDB respectively. The application should also be able to upload pictures into an S3 bucket and register the information in SimpleDB. My main concerns are security and costs. So far I have been using the Amazon Token Vending Machine, but I haven't been successful in customizing it, and while researching I discovered that in the long run it is very expensive. The ultimate goal is to run a "social" picture service for my iOS application: registering new users, authenticating those users, and seeing what permissions they have to which pictures in the bucket - all without having to worry about third parties accessing the private pictures of my users. Sorry for this question, but I am really clueless about how to handle this... I have tried reading many articles, but all this server stuff looks very scary.
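    One common pattern that avoids shipping AWS credentials to the device at all: the EC2 servlet authenticates the user, checks their permissions in SimpleDB, and returns a short-lived pre-signed S3 URL for just the object requested. A sketch of the server side, assuming boto3; the bucket and key names are placeholders:

        import boto3

        s3 = boto3.client("s3")
        url = s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": "user-pictures", "Key": "pics/12345.jpg"},
            ExpiresIn=300,  # link valid for 5 minutes
        )
        # Hand this URL to the iOS client; no AWS keys leave the server.
        # Uploads work the same way with "put_object" instead of "get_object".
        print(url)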


  • How to add a new item to a SharePoint list using web services in C#

    - by Frank
    Hi, I'm trying to add a new item to a SharePoint list from a WinForms application in C# using web services. The only result I get is the unhelpful exception "Exception of type 'Microsoft.SharePoint.SoapServer.SoapServerException' was thrown." I have a web reference named WebSrvRef to http://server/site/subsite/_vti_bin/Lists.asmx and this code:

        XmlDocument xmlDoc;
        XmlElement elBatch;
        XmlNode ndReturn;
        string[] sValues;
        string sListGUID;
        string sViewGUID;

        if (lstResults.Items.Count < 1)
        {
            MessageBox.Show("Unable to Add To SharePoint\n" +
                "No test file processed. The list is blank.",
                "Add To SharePoint", MessageBoxButtons.OK, MessageBoxIcon.Exclamation);
            return;
        }

        WebSrvRef.Lists listService = new WebSrvRef.Lists();
        sViewGUID = "{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}"; // Test List View GUID
        sListGUID = "{xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx}"; // Test List GUID
        listService.Credentials = System.Net.CredentialCache.DefaultCredentials;

        frmAddToSharePoint dlgAddSharePoint = new frmAddToSharePoint();
        if (dlgAddSharePoint.ShowDialog() == DialogResult.Cancel)
        {
            dlgAddSharePoint.Dispose();
            listService.Dispose();
            return;
        }

        sValues = dlgAddSharePoint.Tag.ToString().Split('~');
        dlgAddSharePoint.Dispose();

        string strBatch =
            "<Method ID='1' Cmd='New'>" +
            "<Field Name='Client#'>" + sValues[0] + "</Field>" +
            "<Field Name='Company'>" + sValues[1] + "</Field>" +
            "<Field Name='Contact Name'>" + sValues[2] + "</Field>" +
            "<Field Name='Phone Number'>" + sValues[3] + "</Field>" +
            "<Field Name='Brand'>" + sValues[4] + "</Field>" +
            "<Field Name='Model'>" + sValues[5] + "</Field>" +
            "<Field Name='DPI'>" + sValues[6] + "</Field>" +
            "<Field Name='Color'>" + sValues[7] + "</Field>" +
            "<Field Name='Compression'>" + sValues[8] + "</Field>" +
            "<Field Name='Value % 1'>" + (((float)lstResults.Groups["Value 1"].Tag) * 100).ToString("##0.00") + "</Field>" +
            "<Field Name='Value % 2'>" + (((float)lstResults.Groups["Value 2"].Tag) * 100).ToString("##0.00") + "</Field>" +
            "<Field Name='Value % 3'>" + (((float)lstResults.Groups["Value 3"].Tag) * 100).ToString("##0.00") + "</Field>" +
            "<Field Name='Value % 4'>" + (((float)lstResults.Groups["Value 4"].Tag) * 100).ToString("##0.00") + "</Field>" +
            "<Field Name='Value % 5'>" + (((float)lstResults.Groups["Value 5"].Tag) * 100).ToString("##0.00") + "</Field>" +
            "<Field Name='Comments'></Field>" +
            "<Field Name='Overall'>" + (fTotalScore * 100).ToString("##0.00") + "</Field>" +
            "<Field Name='Average'>" + (fTotalAvg * 100).ToString("##0.00") + "</Field>" +
            "<Field Name='Transfered'>" + sValues[9] + "</Field>" +
            "<Field Name='Notes'>" + sValues[10] + "</Field>" +
            "<Field Name='Resolved'>" + sValues[11] + "</Field>" +
            "</Method>";

        try
        {
            xmlDoc = new System.Xml.XmlDocument();
            elBatch = xmlDoc.CreateElement("Batch");
            elBatch.SetAttribute("OnError", "Continue");
            elBatch.SetAttribute("ListVersion", "1");
            elBatch.SetAttribute("ViewName", sViewGUID);
            strBatch = strBatch.Replace("&", "&amp;");
            elBatch.InnerXml = strBatch;
            ndReturn = listService.UpdateListItems(sListGUID, elBatch);
            MessageBox.Show(ndReturn.OuterXml);
            listService.Dispose();
        }
        catch (Exception Ex)
        {
            MessageBox.Show(Ex.Message + "\n\nSource\n" + Ex.Source +
                "\n\nTargetSite\n" + Ex.TargetSite +
                "\n\nStackTrace\n" + Ex.StackTrace,
                "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
            listService.Dispose();
        }

    What am I doing wrong? What am I missing? Please help!! Frank


  • Access to Salesforce.com Data Through Tableau Desktop

    - by dataintegration
    This article will explain how to connect any of the RSSBus OData Connectors to Tableau's business intelligence tool. While the example uses the Salesforce Connector, the same process can be followed for any of the OData Connectors.

    Step 1: Download and install both the Salesforce Connector from RSSBus and Tableau Desktop from Tableau.
    Step 2: Next, configure the Salesforce Connector to connect with your Salesforce.com account. If you browse to the Help tab in the Salesforce Connector application, there is a link to the Getting Started Guide, which will walk you through setting up the Salesforce Connector.
    Step 3: Once you have successfully configured the Salesforce Connector application, open Tableau and select the Connect to data option at the top left of the window.
    Step 4: Click on the option labeled OData under the section labeled On a server.
    Step 5: A new pop-up will appear. The box under Step 1 of the pop-up must contain the OData URL of the Salesforce Connector table. You can find this by clicking on the Settings tab of the Salesforce Connector. Once you have found the OData entry URL, append the name of the table you want Tableau to connect with to the OData entry URL. In this example, we will connect to the Account table; thus, the URL we enter will be: http://localhost:8181/sfconnector/data/conn/odata.rsc/Account. You will also need to add authentication options in this step. To do this, select the Use a Username and Password option in Step 2 of the pop-up and enter the username and password of the user who has access to the Salesforce Connector. When you are done, click the Connect button in Step 3 of the pop-up.
    Step 6: When the connection to the Salesforce Connector is successful, give the connection a name and click the OK button.
    Step 7: The table columns will be listed on the left side under the Dimensions section of the workspace.
    Step 8: To view your Salesforce.com data, right-click under the table name in the Data section at the top left of the dashboard and select the View Data option. Your Salesforce.com data will appear in Tableau.
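    A quick way to sanity-check the Step 5 URL and credentials outside of Tableau, assuming the requests package (the username and password below are placeholders):

        import requests

        url = "http://localhost:8181/sfconnector/data/conn/odata.rsc/Account"
        r = requests.get(url, auth=("odata_user", "odata_password"))
        r.raise_for_status()
        # A correct URL and login return an OData/Atom feed of Account rows.
        print(r.text[:500])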


  • I/O-intensive MySQL server on Amazon AWS

    - by rhossi
    We recently moved from a traditional data center to cloud computing on AWS. We are developing a product in partnership with another company, and we need to create a database server for the product we'll release. I have been using Amazon Web Services for the past 3 years, but this is the first time I have received a spec with this very specific hardware configuration. I know there are trade-offs and that real hardware will always be faster than virtual machines, and knowing that fact beforehand, what would you recommend?

    1) Amazon EC2?
    2) Amazon RDS?
    3) Something else?
    4) Forget it baby, stick to the real hardware

    Here are the hardware requirements. Each server will be focused on I/O and MySQL for the statistics, with memory size and disk space for the image hosting.

    Server 1:
    - I/O: The main load on this server will be I/O processing. FusionIO cards have proven themselves extremely efficient; this is currently the best you can have in this domain. Spec: Fusion ioDrive2 MLC 365GB (http://www.fusionio.com/load/-media-/1m66wu/docsLibrary/FIO_ioDrive2_Datasheet.pdf)
    - CPU: MySQL will use fewer CPU cores than Apache, but it will use them very hard; the E7 family has 30M L3 cache, which provides a performance boost. Spec: 1x Intel E7-2870 will be OK.
    - Storage: SAS will be good enough in terms of performance, especially considering the space required. Spec: RAID 10 of 4 x SAS 10k or 15k for a total available space of 512 GB.
    - Memory: 64 GB minimum is required on this server, considering the size of the statistics database. Warning: the statistics database will grow quickly; if possible, consider starting with 128 GB directly, it will help.

    Server 2: the spec is identical to Server 1.

    Thanks in advance. Best,


  • What is a good usage scenario for Rackspace Cloud Files CDN (powered by Akamai)? [closed]

    - by Andrew Smith
    I have just set up my website as a static page via Rackspace CDN / Akamai.

    www.example.co.uk is an alias for d9771e6f24423091aebc-345678991111238fabcdef6114258d0e1.r61.cf3.rackcdn.com.
    d9771e6f24423091aebc-345678991111238fabcdef6114258d0e1.r61.cf3.rackcdn.com is an alias for a61.rackcdn.com.
    a61.rackcdn.com is an alias for a61.rackcdn.com.mdc.edgesuite.net.
    a61.rackcdn.com.mdc.edgesuite.net is an alias for a63.dscg10.akamai.net.
    a63.dscg10.akamai.net has address 63.166.98.41
    a63.dscg10.akamai.net has address 63.166.98.40
    a63.dscg10.akamai.net has IPv6 address 2001:428:4c02::cda8:ecb9
    a63.dscg10.akamai.net has IPv6 address 2001:428:4c02::cda8:ed09

    The HTTP header:

    HTTP/1.0 200 OK
    Last-Modified: Fri, 19 Oct 2012 23:27:41 GMT
    ETag: fdf9e14b77def799e09e8ce815a521da
    X-Timestamp: 1350689261.23382
    Content-Type: text/html
    X-Trans-Id: tx457979be3bd746c2b4e5403a1189cdbc
    Cache-Control: public, max-age=900
    Expires: Sat, 27 Oct 2012 22:18:56 GMT
    Date: Sat, 27 Oct 2012 22:03:56 GMT
    Content-Length: 7124
    Connection: keep-alive

    I am wondering if this is really the fastest solution to power the website. Investigating it through http://www.just-ping.com/, it seems that from many places the ping is very high, and during a quick investigation I found that they use GeoIP to resolve addresses based on WHOIS data, which is not accurate; because of that, from many places the ping stays above 300ms (for example, if the ISP is registered in Bangladesh but the request is routed to Bangalore, it stays at 300ms for a month at a time). Meanwhile, just using Amazon Web Services with the Route 53 anycast DNS servers and only 4 EC2 instances, India for example is always below 100ms, while with Akamai it goes above 300ms in some cases; this is because Route 53 uses BGP. From quickly checking Akamai, it seems that they are not taking feedback from the traffic - the high ping stays constant even if I keep downloading large files and videos, which is the opposite of what they say on their website. They state that they optimize performance by taking feedback from the requests, while it seems they just use GeoIP with per-city resolution (mostly big cities). Because of this, AWS with Route 53 / anycast DNS seems to be much more reliable, as does EdgeCast, which uses BGP, but I don't know how much it costs to deploy a static website there. Actually, I don't know whether EdgeCast's claims hold up either, because from isolated places there are many errors - so their performance comes at the cost of delivery quality, because of BGP switching routes during transfers of large files. So I was wondering: what is Akamai really good for? They don't seem to show a clear strength in any area I understand so far, except that they offer some software-based WAF on their website; but what I really care about is the core distribution. So the question is: is Akamai really good for videos? For static websites? ??? So far I have found AWS the most usable, with the most consistent ping and stable transfers.
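    A rough way to reproduce the comparison described above from any client location, assuming the requests package (the URLs below are placeholders): time several full fetches of the same object from each host and compare the averages.

        import time
        import requests

        for url in ("http://www.example.co.uk/",
                    "http://cdn-alternative.example.net/"):
            times = []
            for _ in range(5):
                start = time.perf_counter()
                requests.get(url)
                times.append(time.perf_counter() - start)
            print(url, "avg %.0f ms" % (1000 * sum(times) / len(times)))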


  • SQLAuthority News – Download Whitepaper – Choosing a Tabular or Multidimensional Modeling Experience in SQL Server 2012 Analysis Services

    - by pinaldave
    Data modeling is the most important task for any BI professional. As a matter of fact, the biggest challenge is organizing disparate data into an analytic model that effectively and efficiently supports reporting and analysis. SQL Server 2012 introduces the BI Semantic Model (BISM), a single model that can support a broad range of reporting and analysis while blending two Analysis Services modeling experiences behind the scenes. Multidimensional modeling - enables BI professionals to create sophisticated multidimensional cubes using traditional online analytical processing (OLAP). Tabular modeling - provides self-service data modeling capabilities to business and data analysts. As data modeling evolves and business needs grow, new technologies and tools are emerging to help end users make the necessary adjustments to their reporting and analysis needs. This white paper provides practical guidance to help you decide between the two SQL Server 2012 Analysis Services modeling experiences - tabular or multidimensional. Do let me know your opinion in a comment. In simple words - I would like to know when you would use tabular modeling and when multidimensional modeling. Download Choosing a Tabular or Multidimensional Modeling Experience in SQL Server 2012 Analysis Services. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, T SQL, Technology


  • System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host

    - by coffeeaddict
    We have two identical databases on our dev server. In both there is a table that holds X509Certificate2 data in one of its fields. When I grab the cert, along with the password that I've been using all along, it works against the first database just fine. When I run the same code against the 2nd database, though, I get this error, and I don't understand why, given that the setup is exactly the same and that database is also on this server. So I'm not sure why, when I switch my connection string to talk to Database2 - even though it's set up the same and the code I'm running against it is the same - it complains. Here's the stack trace:

    An existing connection was forcibly closed by the remote host
    Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
    Exception Details: System.Net.Sockets.SocketException: An existing connection was forcibly closed by the remote host
    Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

    Stack Trace:
    [SocketException (0x2746): An existing connection was forcibly closed by the remote host]
    System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size) +232

    [IOException: Unable to read data from the transport connection: An existing connection was forcibly closed by the remote host.]
    System.Net.Sockets.NetworkStream.Read(Byte[] buffer, Int32 offset, Int32 size) +7035903
    System.Net.FixedSizeReader.ReadPacket(Byte[] buffer, Int32 offset, Int32 count) +58
    System.Net.Security.SslState.StartReadFrame(Byte[] buffer, Int32 readBytes, AsyncProtocolRequest asyncRequest) +116
    System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest) +123
    System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest) +86
    System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest) +123
    System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest) +86
    System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest) +123
    System.Net.Security.SslState.ProcessReceivedBlob(Byte[] buffer, Int32 count, AsyncProtocolRequest asyncRequest) +86
    System.Net.Security.SslState.StartReceiveBlob(Byte[] buffer, AsyncProtocolRequest asyncRequest) +123
    System.Net.Security.SslState.ForceAuthentication(Boolean receiveFirst, Byte[] buffer, AsyncProtocolRequest asyncRequest) +7184357
    System.Net.Security.SslState.ProcessAuthentication(LazyAsyncResult lazyResult) +217
    System.Threading.ExecutionContext.runTryCode(Object userData) +376
    System.Runtime.CompilerServices.RuntimeHelpers.ExecuteCodeWithGuaranteedCleanup(TryCode code, CleanupCode backoutCode, Object userData) +0
    System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state) +98
    System.Net.TlsStream.ProcessAuthentication(LazyAsyncResult result) +1134
    System.Net.TlsStream.Write(Byte[] buffer, Int32 offset, Int32 size) +88
    System.Net.PooledStream.Write(Byte[] buffer, Int32 offset, Int32 size) +20
    System.Net.ConnectStream.WriteHeaders(Boolean async) +360

    [WebException: The underlying connection was closed: An unexpected error occurred on a send.]
    System.Web.Services.Protocols.WebClientProtocol.GetWebResponse(WebRequest request) +857631
    System.Web.Services.Protocols.HttpWebClientProtocol.GetWebResponse(WebRequest request) +10
    System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters) +243
    Pmall.PayPal.PayPalApi.PayPalAPIAASoapBinding.SetExpressCheckout(SetExpressCheckoutReq SetExpressCheckoutReq) in C:\www\ssss\ssss\ssss\ssss\Reference.cs:1304
    ssss.PayPal.ExpressCheckout.PayPalCheckout.SetExpressCheckout(PaymentDetailsType[] paymentDetails, String returnURL, String cancelURL, PayPalPaymentFlowType paymentFlowType) in C:\www\ssss\ssss\ssss\PayPalCheckout.cs:96
    ssss.Web.ssss.SetExpressCheckout() in C:\www\ssss\ssss\ssss.aspx.cs:83
    ssss.Web.ssss.Page_Load(Object sender, EventArgs e) in C:\www\ssss\ssss\Register.aspx.cs:24
    System.Web.Util.CalliHelper.EventArgFunctionCaller(IntPtr fp, Object o, Object t, EventArgs e) +25
    System.Web.Util.CalliEventHandlerDelegateProxy.Callback(Object sender, EventArgs e) +42
    System.Web.UI.Control.OnLoad(EventArgs e) +132
    System.Web.UI.Control.LoadRecursive() +66
    System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +2428


  • Scrum Master Stephen Forte Teaches Agile Development, Silverlight and BI at GIDS 2010

    - by rajesh ahuja
    Great Indian Developer Summit 2010 - Gold Standard for India's Software Developer Ecosystem

    Bangalore, March 25, 2010: Stephen Forte - the author of several books on application and database development, including Programming SQL Server 2008, and a certified Scrum Master - is coming this summer to India's biggest summit for the developer ecosystem, the Great Indian Developer Summit. At the summit, Stephen will conduct a workshop guaranteed to give attendees a jump start toward taking the Certified Scrum Master exam. Scrum is one of the most popular Agile project management and development methods, and it is starting to be adopted at major corporations and on very large projects. After an introduction to the basics of Scrum - project planning and estimation; the Scrum Master, team, and product owner; the burndown; and of course the daily Scrum - Stephen will show many real-world applications of the methodology drawn from his own experience as a Scrum Master. Negotiating with the business, estimation, and team dynamics are all discussed, as well as how to use Scrum in small organizations, large enterprise environments, and consulting environments. Stephen will also discuss using Scrum with virtual teams and in an off-shoring environment. He will then take a look at the tools used for Agile development, including planning poker, unit testing, and much more. On 20th April at the GIDS.NET Conference, Stephen will also conduct a series of sessions on Microsoft computing technologies. He will teach how to build data-driven, n-tier Rich Internet Applications (RIA) with Silverlight 4.0. Line-of-business (LOB) applications in Silverlight 4.0 are easy when you tap the power of WCF RIA Services, the Silverlight Toolkit, and elevated out-of-browser support. Stephen's demo-centric session will walk you through an example of building a LOB application with Silverlight 4.0. See how Silverlight and WCF RIA Services support domain logic, services, data binding, validation, server-based paging, authentication, authorization and much more. Silverlight 4.0 means business. Silverlight runs C# and Visual Basic code, and so it seems natural that a business application might share some code between the Silverlight client and its ASP.NET web server. You may want to run some code client-side for interactivity, but re-run that code on the server for security or reliability. This is possible, and there are several techniques you can use to accomplish this goal. In Stephen's second talk, learn about the various techniques and their pros and cons. Some techniques work better in C#, others in VB. Still others are simpler with a little extra tooling or code generation. Any serious Silverlight business application will almost certainly face this issue, and this session gets you going fast. In the third talk, Stephen will explain how to properly architect and deploy a BI application using a mix of some exciting new tools and some old familiar ones. He will start with a traditional relational, transaction-centric database (OLTP) and explore ways to build a data warehouse (OLAP), looking at the star and snowflake schemas. Next he will look at the process of extracting, transforming, and loading (ETL) your OLTP data into your data warehouse. Different techniques for ETL will be described, and the various tradeoffs will be discussed. Then he will look at using the warehouse for reporting, drill-down, and data analysis in Microsoft Excel's PowerPivot 2010. The session will round off by showing how to properly build a cube and a data analysis application on top of that cube, and will conclude by looking at some tools to help with the data visualization process. Every year, GIDS is a game changer for several thousand IT professionals, providing them with a competitive edge over their peers, enlightening them with bleeding-edge information most useful in their daily jobs, helping them network with world-class experts and visionaries, and providing a much-needed thrust to their careers. Attend the Great Indian Developer Summit to gain the information, education and solutions you seek - from post-conference workshops and breakout sessions by expert instructors to keynotes by industry heavyweights, enhanced networking opportunities, and more.

    About Great Indian Developer Summit: Great Indian Developer Summit is the gold standard for India's software developer ecosystem for gaining exposure to and evaluating new projects, tools, services, platforms, languages, software and standards. Packed with premium knowledge, action plans and advice from been-there-done-it veterans, creators, and visionaries, the 2010 edition of the Great Indian Developer Summit features focused sessions, case studies, workshops and power panels that will transform you into a force to reckon with. It features 3 co-located conferences - GIDS.NET, GIDS.Web and GIDS.Java - plus an exclusive day of in-depth tutorials, GIDS.Workshops, from 20 April to 24 April at the IISc campus in Bangalore. At GIDS you'll participate in hundreds of sessions encompassing the full range of Microsoft computing, Java, Agile, RIA, Rich Web, open source/standards, languages, frameworks and platforms; practical tutorials that deep-dive into technical skills and best practices; inspirational keynote presentations; an Expo Hall featuring dozens of the latest projects and products; engaging networking events; and interaction with the best and brightest speakers from around the world. For further information on GIDS 2010, please visit the summit on the web: http://www.developersummit.com/

    A Saltmarch Media Press Release
    E: [email protected]
    Ph: +91 80 4005 1000


  • .net web service: Can't add service reference, only web reference

    - by ScottE
    I have an existing project that consumes two web services. One was added as a service reference, and the other as a web reference. I don't recall why one was added as a web reference, but perhaps it's because I couldn't get it to work! The existing service reference for the first web service works fine, so it's not a .NET version issue. I can successfully create a service reference for the second web service, but none of the methods are available. The .wsdl shows the schema, but the Reference.vb shows only the namespace and none of the methods. To clarify, these are two different 3rd-party web service providers. We'd like to move to the service reference so we have more control over the configuration, as we're having various issues with timeouts. Has anyone come across this before? Edit: Does it matter that there are two services at the address?


  • NHibernate equivalent of LinqToEntitiesDomainService in RIA

    - by VexXtreme
    Hi, when using Entity Framework with RIA domain services, the domain services inherit from LinqToEntitiesDomainService, which, I suppose, allows you to make LINQ queries at a low level (client-side) that propagate into the ORM; meaning that all queries are performed on the database and only relevant results are retrieved to the server, and thus to the client. Example: var query = context.GetCustomersQuery().Where(x => x.Age > 50); Right now we have a domain service which inherits from DomainService and retrieves data through an NHibernate session, as in: virtual public IQueryable<Customer> GetCustomers() { return sessionManager.Session.Linq<Customer>(); } The problem with this approach is that it's impossible to make specific queries without retrieving entire tables to the server (or client) and filtering them there. Is there a way to make LINQ querying work with NHibernate over RIA like it works with EF? If not, we're willing to switch to EF because of this, because the performance impact would be just too severe. Thanks


  • How do I implement IDataServiceMetadataProvider and tell my Data Service to use that custom provider

    - by Pwninstein
    There's no obvious entry point for implementing a custom provider for an ADO.NET Data Service using IDataServiceMetadataProvider, and then telling a Data Service to use that provider. Has anyone had any luck in this area? I've tried implementing this interface on my Data Source class, but none of my breakpoints are hit. There is also no (obvious) way to set the provider from the Data Service's DataServiceConfiguration parameter passed in to the InitializeService function. Any help would be appreciated. Thanks!

    Data Services Providers (ADO.NET Data Services)
    IDataServiceMetadataProvider Members


  • Thrift and .NET - Is this the right combination?

    - by Vadi
    I've been evaluating various technologies for a social networking project, and Thrift in particular interested me. The advantage I see in using Thrift is that I could even implement some services in C++ when the computation involved is huge and may not fit well with .NET, etc. Please share your comments. My questions: Is the open-source version production-ready? Is it the right stack for the services layer when the application (GUI) is primarily developed in ASP.NET and the DB is SQL Server? Are there any other caveats?
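    As an illustration of the cross-language appeal, here is a minimal sketch of calling a heavy computation hosted in a C++ Thrift service (shown from Python for brevity; a C# client generated from the same IDL follows the same pattern). The service, method, and generated module are hypothetical, assuming an IDL like "service ComputeService { double score(1: string userId) }" compiled with thrift --gen py:

        from thrift.transport import TSocket, TTransport
        from thrift.protocol import TBinaryProtocol
        from compute import ComputeService  # hypothetical generated module

        # Standard Thrift client plumbing: socket -> buffered transport -> binary protocol.
        transport = TTransport.TBufferedTransport(TSocket.TSocket("localhost", 9090))
        protocol = TBinaryProtocol.TBinaryProtocol(transport)
        client = ComputeService.Client(protocol)

        transport.open()
        print(client.score("user-42"))  # the computation runs in the C++ service
        transport.close()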


  • Best practices for sending automated daily emails from web service

    - by Tauren
    I am running a web service that currently sends confirmation emails out to new users via the Gmail SMTP servers. As I'm only getting a few new users each day, this hasn't been a problem. I've recently added new features to the webapp that will require a customized message to be sent out to each user every day. Think of this as similar to the regular messages LinkedIn sends out that give you a status report on the activity in your network. Every user's message will be different. With thousands of users, this means thousands of unique messages will be sent each day. Edit: I've since found that these types of email are called "transactional or relationship messages". Spamtacular has a good article on differentiating between marketing and transactional email. I don't think using Gmail's SMTP servers will cut it anymore, but I don't know that for sure. I don't know what Gmail's maximum number of outgoing messages per account is (it might be 100/day), but they limit outgoing mail to 500 recipients per message. I'm not sending a single message to 500 recipients; I'm going to be sending thousands of customized messages, with each recipient getting one per day. I'm interested to learn any best practices for doing this (especially for Java-based webapps). Here are some of my thoughts and concerns on it: Should I set up my own outgoing mail server? If I do this, it seems like I'll have all sorts of other issues to worry about, such as preventing mail server abuse, monitoring bounces, allowing ways to opt out of emails, etc. Are there any tools or services to help with this? Maybe something like OpenEMM or a service like MailChimp? But those seem focused more toward email marketing campaigns. I don't think the webapp itself should handle sending emails, as it currently does for new-user signups. I'm thinking I should set up a separate messaging server that can access the same backend/datastore as the webapp. Thoughts on this? Should I consider setting up some sort of message queueing service to help with this, such as JMS, RabbitMQ, ActiveMQ, etc.? Do I need to provide users a way to opt out? Do I need to flag these as bulk messages? I don't really consider these email marketing messages, but I'm unsure what is considered appropriate or proper netiquette. Any advice is appreciated. I'm also very interested in open source tools or web services that simplify things and could help me to ramp up as quickly as possible. Thanks!
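    To make the queueing option concrete, here is a minimal sketch of the producer side, assuming the pika package and a RabbitMQ broker on localhost (the queue name and message shape are placeholders): a nightly job enqueues one message per user, and a separate worker process drains the queue and performs the actual SMTP sends at a controlled rate.

        import json
        import pika

        connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
        channel = connection.channel()
        channel.queue_declare(queue="daily_digest", durable=True)

        # One message per user; a worker consumes these and sends the email.
        for user in [{"email": "user@example.com", "report": "..."}]:  # placeholder data
            channel.basic_publish(
                exchange="",
                routing_key="daily_digest",
                body=json.dumps(user),
                properties=pika.BasicProperties(delivery_mode=2),  # survive broker restarts
            )
        connection.close()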


  • scvmap, disco, xsd, wsdl, svcinfo and datasource files

    - by David Gray Wright
    We have a web service named, let's say, Foo. So there is a Foo.svc file and a code-behind Foo.svc.cs. We add a Silverlight project and wish to use the Foo.svc services, so we add a Service Reference and call its namespace FooBar. This creates the following files: Reference.cs, Reference.svcmap, Foo.xsd, Foo.disco, configuration.svcinfo, Foo.wsdl, and various *.datasource files. Over time we update Foo.svc and add more web services (methods and interfaces), and the number of files in the FooBar directory keeps growing. I have 26 Foo(nn).xsd files in this directory - where nn = 1 to 26 - and my configuration.svcinfo is up to configuration91.svcinfo. My question is this: do any of these files need to be version-controlled? Can they all be deleted each time you do a build/deploy (as long as you do an Update Service Reference)?

