Search Results

Search found 30367 results on 1215 pages for 'service reference'.

Page 562 of 1215

  • When is it appropriate to use error codes?

    - by Jim Hurne
    In languages that support exception objects (Java, C#), when is it appropriate to use error codes? Is the use of error codes ever appropriate in typical enterprise applications? Many well-known software systems employ error codes (and a corresponding error code reference). Some examples include operating systems (Windows), databases (Oracle, DB2), and middleware products (WebLogic, WebSphere). What benefits do error codes provide? What are the disadvantages of using error codes?
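
    A common middle ground in exception-based languages is to keep throwing exceptions but attach a stable, documented code to them, which is roughly what the large systems above do. A minimal sketch (the exception class and the "ORD-1042" code are made up for illustration):

        using System;

        // A hypothetical application exception that carries a documented, stable
        // error code alongside the human-readable message.
        public class AppException : Exception
        {
            public string ErrorCode { get; private set; }   // e.g. "ORD-1042", listed in an error-code reference

            public AppException(string errorCode, string message)
                : base(message)
            {
                ErrorCode = errorCode;
            }
        }

        public static class OrderService
        {
            public static void PlaceOrder(int quantity)
            {
                if (quantity <= 0)
                {
                    // The code stays stable across releases and message translations,
                    // so support staff and callers can look it up in a reference.
                    throw new AppException("ORD-1042", "Quantity must be greater than zero.");
                }
                // ... place the order ...
            }
        }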

    Read the article

  • Get textbox binding in wpf datagridtemplatecolumn

    - by klawusel
    Hi, I have a WPF DataGrid containing multiple DataGridTemplateColumns, which are all built up from a DataTemplate that contains a TextBox. Now I want to get the binding of the TextBox (I have a reference to the template column whose TextBox's binding I would like to determine). Alternatively, it would be nice to retrieve the x:Name of the template column. Any hints? Regards, Klaus
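
    One possible approach, sketched below and untested against this exact setup, is to instantiate the column's CellTemplate, locate the TextBox inside it, and read the binding with BindingOperations.GetBinding. The method names are mine, and the DataGrid types are assumed to be the WPF 4 / WPF Toolkit ones:

        using System.Windows;
        using System.Windows.Controls;
        using System.Windows.Data;

        static class DataGridColumnHelper
        {
            // Sketch: instantiate the column's CellTemplate and return the Binding
            // set on the TextBox's Text property. Assumes the template contains one TextBox.
            public static Binding GetTextBoxBinding(DataGridTemplateColumn column)
            {
                if (column == null || column.CellTemplate == null) return null;

                DependencyObject content = column.CellTemplate.LoadContent();
                TextBox textBox = content as TextBox ?? FindTextBox(content);

                return textBox == null
                    ? null
                    : BindingOperations.GetBinding(textBox, TextBox.TextProperty);
            }

            // Walk the logical tree of the freshly loaded template content.
            private static TextBox FindTextBox(DependencyObject parent)
            {
                foreach (object child in LogicalTreeHelper.GetChildren(parent))
                {
                    var dependencyChild = child as DependencyObject;
                    if (dependencyChild == null) continue;

                    TextBox result = dependencyChild as TextBox ?? FindTextBox(dependencyChild);
                    if (result != null) return result;
                }
                return null;
            }
        }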

    Read the article

  • Where to store 3rd party libraries?

    - by zerkms
    I have an ASP.NET MVC 2 application. Now I'm reimplementing it to work with Ninject. All is fine except one thing: where should I store Ninject.dll? I've created a lib directory inside my application directory and referenced lib/Ninject.dll, but maybe there are some general conventions on how to handle such cases?

    Read the article

  • WCF, IIS6.0 (413) Request Entity Too Large.

    - by Andrew Kalashnikov
    Hello, guys. I've got an annoying problem. I have a WCF service (basicHttpBinding with Transport security, HTTPS). This service implements a contract consisting of two methods: LoadData and GetData. GetData works fine: my client receives a package of about 2 MB without problems, and everything works correctly. But when I try to load data through bool LoadData(Stream data); (the method's signature), I get (413) Request Entity Too Large. Stack trace: Server stack trace: ServiceModel.Channels.HttpChannelUtilities.ValidateRequestReplyResponse(HttpWebRequest request, HttpWebResponse response, HttpChannelFactory factory, WebException responseException, ChannelBinding channelBinding) System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelRequest.WaitForReply(TimeSpan timeout) System.ServiceModel.Channels.RequestChannel.Request(Message message, TimeSpan timeout) System.ServiceModel.Dispatcher.RequestChannelBinder.Request(Message message, TimeSpan timeout) I tried the steps in http://blogs.msdn.com/jiruss/archive/2007/04/13/http-413-request-entity-too-large-can-t-upload-large-files-using-iis6.aspx, but they don't work. My server is Windows Server 2003 with IIS 6.0. Please help.
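
    The 413 usually comes from a size limit on the receiving side rather than from the contract itself, so the service's binding quotas are worth checking. A sketch of the programmatic equivalent, with an arbitrary 10 MB figure (the service and contract it is attached to are not shown):

        using System.ServiceModel;

        static class BindingFactory
        {
            // Sketch only: a basicHttpBinding with the quotas that commonly cause
            // HTTP 413 on uploads raised. The 10 MB figure is an example, not a recommendation.
            public static BasicHttpBinding CreateUploadBinding()
            {
                var binding = new BasicHttpBinding(BasicHttpSecurityMode.Transport)
                {
                    MaxReceivedMessageSize = 10 * 1024 * 1024, // default is 65,536 bytes
                    MaxBufferSize = 10 * 1024 * 1024,
                    TransferMode = TransferMode.Streamed       // suits a Stream-based LoadData
                };
                binding.ReaderQuotas.MaxArrayLength = 10 * 1024 * 1024;
                return binding;
            }
        }

    Because this is HTTPS on IIS 6.0, the site's UploadReadAheadSize metabase setting can also cap the request size independently of WCF, so both sides may need raising.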

    Read the article

  • Problem with nested panels in Java

    - by stefan89
    I have a panel inside another panel, and I want to access a member of the child panel from the parent panel. The child panel reference held by the parent panel doesn't see all the members that the child has. Thanks! PS: the members I can't access are public.

    Read the article

  • How to transform SoapFault to SoapMessage via Interceptor in CXF?

    - by Michal Mech
    I have a web service created and configured via Spring and CXF. See the beans below: <?xml version="1.0" encoding="UTF-8"?> <beans <!-- omitted -->> <import resource="classpath:META-INF/cxf/cxf.xml" /> <import resource="classpath:META-INF/cxf/cxf-extension-soap.xml" /> <import resource="classpath:META-INF/cxf/cxf-servlet.xml" /> <bean id="internalActService" class="package.InternalActServiceImpl" /> <jaxws:endpoint implementor="#internalActService" address="/InternalActService"> <jaxws:properties> <entry key="schema-validation-enabled" value="true" /> </jaxws:properties> <jaxws:outFaultInterceptors> <bean class="package.InternalActServiceFaultOutInterceptor" /> </jaxws:outFaultInterceptors> </jaxws:endpoint> </beans> As you can see, I added schema validation to my web service. But CXF throws a SoapFault when a request does not conform to the schema. I want to send the client a SoapMessage instead of a SoapFault, which is why I added outFaultInterceptors. My question is how to transform the SoapFault into a SoapMessage? I've made a few attempts, but I don't know how to implement the outFaultInterceptor.

    Read the article

  • Perl, array referencing

    - by Mike
    Consider this Perl code: my @a=[[1]]; print $a[0][0] . "\n"; Output: ARRAY(0x229e8) Why does it print an array reference instead of 1? I would have expected @a to be an array of size 1 holding a reference to a second array containing only one element, 1.

    Read the article

  • NHibernate (3.1.0.4000) NullReferenceException using Query<> and NHibernate Facility

    - by TigerShark
    I have a problem with NHibernate that I can't seem to find any solution for. In my project I have a simple entity (Batch), but whenever I try to run the following test, I get an exception. I've tried a couple of different ways to perform a similar query, but I get an almost identical exception for all of them (it differs only in which LINQ method is being executed). The first test: [Test] public void QueryLatestBatch() { using (var session = SessionManager.OpenSession()) { var batch = session.Query<Batch>() .FirstOrDefault(); Assert.That(batch, Is.Not.Null); } } The exception: System.NullReferenceException : Object reference not set to an instance of an object. at NHibernate.Linq.NhQueryProvider.PrepareQuery(Expression expression, ref IQuery query, ref NhLinqExpression nhQuery) at NHibernate.Linq.NhQueryProvider.Execute(Expression expression) at System.Linq.Queryable.FirstOrDefault(IQueryable`1 source) The second test: [Test] public void QueryLatestBatch2() { using (var session = SessionManager.OpenSession()) { var batch = session.Query<Batch>() .OrderBy(x => x.Executed) .Take(1) .SingleOrDefault(); Assert.That(batch, Is.Not.Null); } } The exception: System.NullReferenceException : Object reference not set to an instance of an object. at NHibernate.Linq.NhQueryProvider.PrepareQuery(Expression expression, ref IQuery query, ref NhLinqExpression nhQuery) at NHibernate.Linq.NhQueryProvider.Execute(Expression expression) at System.Linq.Queryable.SingleOrDefault(IQueryable`1 source) However, this one passes (using QueryOver<): [Test] public void QueryOverLatestBatch() { using (var session = SessionManager.OpenSession()) { var batch = session.QueryOver<Batch>() .OrderBy(x => x.Executed).Asc .Take(1) .SingleOrDefault(); Assert.That(batch, Is.Not.Null); Assert.That(batch.Executed, Is.LessThan(DateTime.Now)); } } Using the QueryOver< API is not bad at all, but I'm just kind of baffled that the Query< API isn't working, which is kind of sad, since the First() operation is very concise and our developers really enjoy LINQ. I really hope there is a solution to this, as it seems strange for these methods to fail such a simple test. EDIT: I'm using Oracle 11g; my mappings are done with FluentNHibernate, registered through Castle Windsor with the NHibernate Facility. As I wrote, the odd thing is that the query works perfectly with the QueryOver< API, but not through LINQ.

    Read the article

  • Dynamically created controls and the ASP.NET page lifecycle

    - by Dirk
    I'm working on an ASP.NET project in which the vast majority of the forms are generated dynamically at run time (form definitions are stored in a DB for customizability). Therefore, I have to dynamically create and add my controls to the Page every time OnLoad fires, regardless of IsPostBack. This has been working just fine and .NET takes care of managing ViewState for these controls. protected override void OnLoad(EventArgs e) { base.OnLoad(e); RenderDynamicControls(); } private void RenderDynamicControls(){ //1. call service layer to retrieve form definition //2. create and add controls to page container } I have a new requirement in which if a user clicks on a given button (this button is created at design time) the page should be re-rendered in a slightly different way. So in addition to the code that executes in OnLoad (i.e. RenderDynamicControls()), I have this code: protected void MyButton_Click(object sender, EventArgs e) { RenderDynamicControlsALittleDifferently(); } private void RenderDynamicControlsALittleDifferently() { //1. clear all controls from the page container added in RenderDynamicControls() //2. call service layer to retrieve form definition //3. create and add controls to page container } My question is, is this really the only way to accomplish what I'm after? It seems beyond hacky to effectively render the form twice simply to respond to a button click. I gather from my research that this is simply how the page lifecycle works in ASP.NET: namely, that OnLoad must fire on every postback before child events are invoked. Still, it's worthwhile to check with the SO community before having to drink the kool-aid. On a related note, once I get this feature completed, I'm planning on throwing an UpdatePanel on the page to perform the page updates via Ajax. Any code/advice that makes that transition easier would be much appreciated. Thanks
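
    For what it's worth, recreating the dynamic controls in OnLoad on every postback really is how the ASP.NET lifecycle expects this to work; the duplication can at least be reduced to a single build method that branches on a flag persisted in ViewState. A rough sketch with hypothetical member names:

        // Sketch, inside the Page class: build once per request and branch on a flag
        // kept in ViewState, so the click handler only changes how the next build runs.
        protected override void OnLoad(EventArgs e)
        {
            base.OnLoad(e);
            BuildDynamicControls(UseAlternateLayout);
        }

        protected void MyButton_Click(object sender, EventArgs e)
        {
            UseAlternateLayout = true;            // remember the choice across postbacks
            formContainer.Controls.Clear();       // 'formContainer' is the page container (assumed name)
            BuildDynamicControls(UseAlternateLayout);
        }

        private bool UseAlternateLayout
        {
            get { return (bool?)ViewState["UseAlternateLayout"] ?? false; }
            set { ViewState["UseAlternateLayout"] = value; }
        }

        private void BuildDynamicControls(bool alternate)
        {
            // 1. call the service layer to retrieve the form definition
            // 2. create and add controls to the page container, varying by 'alternate'
        }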

    Read the article

  • Is there the equivalent of cloud computing for modems?

    - by morpheous
    I asked this question on SF, and someone recommended that I ask it here - (I don't think I have enough points to move a question from SF to SO - and in any case, I don't know how to do it - so here is the question again): I am interested in the concept of PAAS (platform as a service). However, all talk about SAAS/PAAS seems to focus only on the computer itself - not its peripherals. Is it possible to 'outsource' modems as a resource - so that an app running remotely can pump data to a modem in the cloud? As a bit of background to the question, a group of us are thinking of starting a company that offers similar services to companies like Twilio etc. - but I want to 'outsource' both the computing hardware (that's PAAS - the easy bit) and the modems (that's what I can't seem to find any info on). Does anyone know if modems can be bundled as part of a PAAS service? - alternatively, is there a way that an application running on one computer can communicate (i.e. pump data) to a remote modem residing on another machine? I assume I can come up with some protocol over UDP or TCP - but there is no point reinventing the wheel if such a protocol already exists (or if some open source software allows one to do this). Any suggestions on how to solve this problem?

    Read the article

  • Translating CURL to FLEX HTTPRequests

    - by Joshua
    I am trying to convert some cURL code to Flex/ActionScript. Since I am 100% ignorant about cURL and 50% ignorant about Flex and 90% ignorant about HTTP in general... I'm having some significant difficulty. The following cURL code is from http://code.google.com/p/ga-api-http-samples/source/browse/trunk/src/v2/accountFeed.sh, and I have every reason to believe that it's working correctly. USER_EMAIL="[email protected]" #Insert your Google Account email here USER_PASS="secretpass" #Insert your password here googleAuth="$(curl https://www.google.com/accounts/ClientLogin -s \ -d Email=$USER_EMAIL \ -d Passwd=$USER_PASS \ -d accountType=GOOGLE \ -d source=curl-accountFeed-v2 \ -d service=analytics \ | awk /Auth=.*/)" feedUri="https://www.google.com/analytics/feeds/accounts/default\ ?prettyprint=true" curl $feedUri --silent \ --header "Authorization: GoogleLogin $googleAuth" \ --header "GData-Version: 2" The following is my abortive attempt to translate the above cURL to AS3: var request:URLRequest=new URLRequest("https://www.google.com/analytics/feeds/accounts/default"); request.method=URLRequestMethod.POST; var GoogleAuth:String="$(curl https://www.google.com/accounts/ClientLogin -s " + "-d [email protected] " + "-d Passwd=secretpass " + "-d accountType=GOOGLE " + "-d source=curl-accountFeed-v2" + "-d service=analytics " + "| awk /Auth=.*/)"; request.requestHeaders.push(new URLRequestHeader("Authorization", "GoogleLogin " + GoogleAuth)); request.requestHeaders.push(new URLRequestHeader("GData-Version", "2")); var loader:URLLoader=new URLLoader(); loader.dataFormat=URLLoaderDataFormat.BINARY; loader.addEventListener(Event.COMPLETE, GACompleteHandler); loader.addEventListener(IOErrorEvent.IO_ERROR, GAErrorHandler); loader.addEventListener(SecurityErrorEvent.SECURITY_ERROR, GAErrorHandler); loader.load(request); This probably provides you all with a good laugh, and that's okay, but if you can take any pity on me, please let me know what I'm missing. I readily admit functional ineptitude, so letting me know how stupid I am is optional.

    Read the article

  • How to create a netTcpBinding from this customBinding

    - by evgeni
    I am new to the WCF programming model and I want to use netTcpBinding. Before I ask my question, below is my custom binding: <customBinding> <binding name="basic"> <security authenticationMode="UserNameForCertificate"/> <binaryMessageEncoding/> <httpsTransport/> </binding> </customBinding> When I create a service reference using a simple console application, it finds a certificate and asks me to use it, and this way I can use the web service... but when I change the binding to netTcpBinding with TransportWithMessageCredential, the service looks for the certificate and cannot find it, like this: </binding> </netTcpBinding> ServiceCertificate.SetCertificate(StoreLocation.LocalMachine, StoreName.My, X509FindType.FindByIssuerName, "Contoso.com"). At this point I use a custom name validator and I do it programmatically. So when I use netTcpBinding with TransportWithMessageCredential, why can SetCertificate not find the installed certificate? Am I missing something, or do I have to add something? Thanks.
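
    A guess at the cause: with the original customBinding the certificate rides on httpsTransport, where IIS/HTTP.SYS owns the SSL binding, whereas netTcpBinding with TransportWithMessageCredential needs the service certificate handed to WCF explicitly via the host's service credentials (in code or in a <serviceCredentials> behavior). A self-hosted sketch, where the contract, address and "Contoso.com" issuer are placeholders:

        using System;
        using System.Security.Cryptography.X509Certificates;
        using System.ServiceModel;

        [ServiceContract] public interface IMyService { [OperationContract] string Ping(string s); }
        public class MyService : IMyService { public string Ping(string s) { return s; } }

        class Program
        {
            static void Main()
            {
                var binding = new NetTcpBinding(SecurityMode.TransportWithMessageCredential);
                binding.Security.Message.ClientCredentialType = MessageCredentialType.UserName;

                var host = new ServiceHost(typeof(MyService), new Uri("net.tcp://localhost:9000/MyService"));
                host.AddServiceEndpoint(typeof(IMyService), binding, "");

                // The certificate has to exist in LocalMachine\My, be findable by this
                // criterion, and its private key must be readable by the service account.
                host.Credentials.ServiceCertificate.SetCertificate(
                    StoreLocation.LocalMachine,
                    StoreName.My,
                    X509FindType.FindByIssuerName,
                    "Contoso.com");

                host.Open();
                Console.WriteLine("Service running; press Enter to stop.");
                Console.ReadLine();
                host.Close();
            }
        }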

    Read the article

  • How to add a Favicon to RSS feeds using Wordpress?

    - by Josh
    A friend of mine wants to make her favicon visible when people use Google Reader to view the RSS feed of her WordPress blog. Anyone have a quick tip on how / where to make that reference? Her current web host is Bluehost, and apparently that is the "icon" people see when using Google Reader. Any suggestions would be helpful.

    Read the article

  • Best strategy for moving data between physical tiers in ASP.net

    - by Pete Lunenfeld
    Building a new ASP.net application, and planning to separate DB, 'service' tier and Web/UI tier into separate physical layers. What is the best/easiest strategy to move serialized objects between the service tier and the UI tier? I was considering serializing POCOs into JSON using simple ASP.net pages to serve the middle tier. Meaning that the UI/Web tier will request data from a (hidden to the outside user) web server that will return a JSON string. This kind of JSON 'emitter' seems easily testable. It also seems easily compressible for efficiently moving data over the WAN between tiers. I know that some folks use .asmx webservices for this kind of task, but this seems like there is excess overhead with SOAP, and the package is not as human readable (testable) as POCOs serialized as JSON. Others are using more complex technology like WCF which we have never used. Does anyone have advice for choosing a method for moving data/objects between the data (db) tier and the web (UI) tier over the WAN using .net technologies? Thanks!!!
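
    If the JSON-over-HTTP route is taken, the 'emitter' on the service tier does not need a full page; a generic handler (.ashx) that serializes a POCO is enough. A minimal sketch using the built-in JavaScriptSerializer (the DTO, handler name and data are made up):

        using System.Collections.Generic;
        using System.Web;
        using System.Web.Script.Serialization;   // System.Web.Extensions assembly

        // Hypothetical POCO returned by the service tier.
        public class CustomerDto
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        // A minimal JSON "emitter": the UI tier requests e.g. /Customers.ashx and
        // deserializes the string on its side with the same serializer.
        public class CustomersHandler : IHttpHandler
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                var customers = new List<CustomerDto>
                {
                    new CustomerDto { Id = 1, Name = "Example" }   // placeholder data
                };

                context.Response.ContentType = "application/json";
                context.Response.Write(new JavaScriptSerializer().Serialize(customers));
            }
        }

    Compression over the WAN can then be left to IIS, and the endpoint is easy to exercise from a test or a browser.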

    Read the article

  • Does GetSystemInfo (on Windows) always return the number of logical processors?

    - by mhughes
    Reading up on this, and specifically reading the Microsoft docs, it looks like it should be returning the number of PHYSICAL processors, and that you should use GetLogicalProcessorInformation to figure out how many LOGICAL processors you have. Here's the doc I found on the SYSTEM_INFO structure: http://msdn.microsoft.com/en-us/library/ms724958(v=VS.85).aspx And here's the doc on GetLogicalProcessorInformation: http://msdn.microsoft.com/en-us/library/ms683194.aspx Reading up on it further, though, in most of the discussions I've found on this topic, developers say that GetSystemInfo (and the SYSTEM_INFO structure) reports the number of LOGICAL processors. When I search again, I find that MS did release some info on this (and a hotfix) here: http://support.microsoft.com/kb/936235 Reading that, it sounds like on XP, pre-Service Pack 3, GetSystemInfo reports the number of LOGICAL processors in the SYSTEM_INFO structure. It also reads to me that on Windows Vista and Windows 7, GetSystemInfo should be reporting the number of PHYSICAL processors (differently from Windows XP pre-Service Pack 3). Does anyone know what it actually does? Does GetSystemInfo really report the number of physical processors (on the same computer) differently, depending on which OS it's running on?
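
    One way to settle it empirically on a given OS and service-pack combination is to P/Invoke GetSystemInfo and compare dwNumberOfProcessors with Environment.ProcessorCount, which reports logical processors. A quick sketch:

        using System;
        using System.Runtime.InteropServices;

        class ProcessorCountCheck
        {
            [StructLayout(LayoutKind.Sequential)]
            struct SYSTEM_INFO
            {
                public ushort wProcessorArchitecture;
                public ushort wReserved;
                public uint dwPageSize;
                public IntPtr lpMinimumApplicationAddress;
                public IntPtr lpMaximumApplicationAddress;
                public UIntPtr dwActiveProcessorMask;
                public uint dwNumberOfProcessors;
                public uint dwProcessorType;
                public uint dwAllocationGranularity;
                public ushort wProcessorLevel;
                public ushort wProcessorRevision;
            }

            [DllImport("kernel32.dll")]
            static extern void GetSystemInfo(out SYSTEM_INFO lpSystemInfo);

            static void Main()
            {
                SYSTEM_INFO info;
                GetSystemInfo(out info);

                // Environment.ProcessorCount reports logical processors, so if the two
                // numbers match on a hyper-threaded box, GetSystemInfo is counting
                // logical processors on that OS.
                Console.WriteLine("GetSystemInfo dwNumberOfProcessors: " + info.dwNumberOfProcessors);
                Console.WriteLine("Environment.ProcessorCount:         " + Environment.ProcessorCount);
            }
        }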

    Read the article

  • Force import module from Python standard library instead of PYTHONPATH default

    - by jrdioko
    I have a custom module in one of the directories in my PYTHONPATH with the same name as one of the standard library modules, so that when I import module_name, that module gets loaded. If I want to use the original standard library module, is there any way to force Python to import from the standard library rather than from the PYTHONPATH directory, short of renaming the custom module and changing every reference to point to the new name?

    Read the article

  • Visual Studio 2008: Can't connect to known good TFS 2010 beta 2

    - by p.campbell
    A freshly installed TFS 2010 Beta 2 is at http://serverX:8080/tfs. A Windows 7 developer machine has VS 2008 Pro SP1 and the VS 2008 Team Explorer (no SP). The TFS 2008 Service Pack 1 didn't work for me: "None of the products that are addressed by this software update are installed on this computer." The developer machine is able to browse the TFS site at the above URL. The issue is around trying to add the TFS server into the Team Explorer window in Visual Studio 2008. Here's the error it shows: unable to connect to this Team Foundation Server. Possible reasons for failure include: The Team Foundation Server name, port number or protocol is incorrect. The Team Foundation Server is offline. Password is expired or incorrect. The TFS server is up and running properly, firewall ports are open, and it is accessible via the browser on the dev machine! Question: how can you connect from VS 2008 Pro to a TFS 2010 Beta 2 server? Resolution: here's how I solved this problem. I installed VS 2008 Team Explorer as above, re-installed VS 2008 Service Pack 1, and then, when adding a TFS server to Team Explorer, specified the URL as http://[tfsserver]:[port]/[vdir]/[projectCollection] (in my case, http://serverX:8080/tfs/AppDev-TestProject). You cannot simply add the TFS server name and have VS look for all Project Collections on the server; TFS 2010 has a new URL format (by default) and VS 2008 doesn't know how to gather that list.

    Read the article

  • Does HttpListener work well on Mono?

    - by billpg
    Hi everyone. I'm looking to write a small web service to run on a small Linux box. I prefer to code in C#, so I'm looking to use Mono. I don't want the overhead of running a full web server or Mono's version of ASP.NET. I'm thinking of having a single process with a thread dealing with each client connection. Shared memory between threads instead of a database. I've read a little on Microsoft's version of HttpListener and how it works with the Http.sys driver. Alas, Mono's documentation on this class is just the automated class interface with no discussion of how it works under the hood. (Linux doesn't have Http.sys, so I imagine it's implemented substantially differently.) Could anyone point me towards some resources discussing this module please? Many thanks, Bill, billpg.com (A little background to my question for the interested.) Some time ago, I asked this question, interested in keeping a long conversation open with lots of back-and-forth. I had settled on designing my own ad-hoc protocol, but people I spoke to really wanted a REST interface, even at the cost of the "Okay, send your command now" signal. So, I wondered about running ASP.NET on a Linux/Mono server, but stumbled upon HttpListener. This seemed ideal, as each "conversation" could run in a separate thread. The thread that calls HttpListener in a loop can look for which thread each incoming connection is for and pass the reference to that thread. The alternative, for an ASP.NET-driven service, would be to have the ASPX code pick up the state from a database and write back the new state when it finishes. Yes, it would work, but that's a lot of overhead.
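
    For reference, the portable part of the plan is small; a minimal HttpListener accept loop that hands each request to a worker thread looks roughly like the sketch below. Whether Mono's implementation (which has no Http.sys to lean on and listens on the socket itself) behaves the same under load is exactly the open question:

        using System;
        using System.Net;
        using System.Text;
        using System.Threading;

        class MiniServer
        {
            static void Main()
            {
                var listener = new HttpListener();
                listener.Prefixes.Add("http://*:8080/");
                listener.Start();
                Console.WriteLine("Listening on port 8080...");

                while (true)
                {
                    // GetContext blocks until a request arrives; hand it to the pool
                    // so long-running conversations don't stall the accept loop.
                    HttpListenerContext context = listener.GetContext();
                    ThreadPool.QueueUserWorkItem(Handle, context);
                }
            }

            static void Handle(object state)
            {
                var context = (HttpListenerContext)state;
                byte[] body = Encoding.UTF8.GetBytes("hello from " + context.Request.Url.AbsolutePath);

                context.Response.ContentType = "text/plain";
                context.Response.ContentLength64 = body.Length;
                context.Response.OutputStream.Write(body, 0, body.Length);
                context.Response.OutputStream.Close();
            }
        }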

    Read the article

  • Minimum privileges to read SQL Jobs using SQL SMO

    - by Gustavo Cavalcanti
    I wrote an application to use SQL SMO to find all SQL Servers, databases, jobs and job outcomes. This application is executed through a scheduled task using a local service account. This service account is local to the application server only and is not present in any SQL Server to be inspected. I am having problems getting information on job and job outcomes when connecting to the servers using a user with dbReader rights on system tables. If we set the user to be sysadmin on the server it all works fine. My question to you is: What are the minimum privileges a local SQL Server user needs to have in order to connect to the server and inspect jobs/job outcomes using the SQL SMO API? I connect to each SQL Server by doing the following: var conn = new ServerConnection { LoginSecure = false, ApplicationName = "SQL Inspector", ServerInstance = serverInstanceName, ConnectAsUser = false, Login = user, Password = password }; var smoServer = new Server (conn); I read the jobs by reading smoServer.JobServer.Jobs and read the JobSteps property on each of these jobs. The variable server is of type Microsoft.SqlServer.Management.Smo.Server. user/password are of the user found in each SQL Server to be inspected. If "user" is SysAdmin on the SQL Server to be inspected all works ok, as well as if we set ConnectAsUser to true and execute the scheduled task using my own credentials, which grants me SysAdmin privileges on SQL Server per my Active Directory membership. Thanks!

    Read the article

  • Java : HTTP POST Request

    - by SpunkerBaba
    I have to do an HTTP POST request to a web service to authenticate the user with a username and password. The web service guy gave me the following information to construct the HTTP POST request: POST /login/dologin HTTP/1.1 Host: webservice.companyname.com Content-Type: application/x-www-form-urlencoded Content-Length: 48 id=username&num=password&remember=on&output=xml The XML response that I will be getting is: <?xml version="1.0" encoding="ISO-8859-1"?> <login> <message><![CDATA[]]></message> <status><![CDATA[true]]></status> <Rlo><![CDATA[Username]]></Rlo> <Rsc><![CDATA[9L99PK1KGKSkfMbcsxvkF0S0UoldJ0SU]]></Rsc> <Rm><![CDATA[b59031b85bb127661105765722cd3531==AO1YjN5QDM5ITM]]></Rm> <Rl><![CDATA[[email protected]]]></Rl> <uid><![CDATA[3539145]]></uid> <Rmu><![CDATA[f8e8917f7964d4cc7c4c4226f060e3ea]]></Rmu> </login> This is what I am doing: HttpPost postRequest = new HttpPost(urlString); How do I construct the rest of the parameters?

    Read the article

  • RIA Services for transmitting non DB object-graph

    - by Mike Gates
    I have been getting into RIA Services because I thought it would simplify dealing with the services layer of web applications I wish to build. I see lots of examples out there showing how to create DomainService classes which expose and consume entities that have some kind of relational database backing, and therefore have foreign-key relationships. However, I would like to know how to expose and consume normal object graphs: objects that contain references to each other but don't have foreign keys. For example, say I want a service operation called "GetFolderInformation(string pathToFolder)". I want this to return a custom object called "FolderInformation" structured with: - string Name - IEnumerable<FileInformation> Files I cannot get this to work because it seems that RIA wants to deal with entities that have foreign key relationships. Why? Why can't the serializer just see my object references and recreate them in the proxy on the other side? Data exists behind service layers that doesn't necessarily have foreign key relationships...like folder/file for example. EDIT: I realized I hadn't asked my question! My question is, is there a way to do what I am trying to do?
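
    As far as I understand it, WCF RIA Services can carry graphs that never touch a database, but every transported type still has to look like an entity: a [Key] member on each class, and parent/child links declared with [Association] plus [Include] instead of plain object references, which is why arbitrary POCO graphs fail by default. A rough, from-memory sketch of the folder/file shape under those rules (the namespaces are the released RIA Services ones and may differ in earlier previews, so treat this as a starting point only):

        using System.Collections.Generic;
        using System.ComponentModel.DataAnnotations;          // [Key], [Association]
        using System.ServiceModel.DomainServices.Hosting;     // [EnableClientAccess]
        using System.ServiceModel.DomainServices.Server;      // DomainService, [Include]

        public class FolderInformation
        {
            [Key]
            public string Path { get; set; }            // RIA needs a key even without a database

            public string Name { get; set; }

            [Include]                                    // ship the children with the parent
            [Association("Folder_Files", "Path", "FolderPath")]
            public IEnumerable<FileInformation> Files { get; set; }
        }

        public class FileInformation
        {
            [Key]
            public string FullPath { get; set; }

            public string FolderPath { get; set; }       // acts as the "foreign key" back to FolderInformation.Path
            public string Name { get; set; }
        }

        [EnableClientAccess]
        public class FileSystemDomainService : DomainService
        {
            public IEnumerable<FolderInformation> GetFolderInformation(string pathToFolder)
            {
                // Populate from the file system instead of a database
                // (directory/file enumeration omitted for brevity).
                yield return new FolderInformation { Path = pathToFolder, Name = pathToFolder };
            }
        }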

    Read the article

  • Accessing the ASP.NET Cache from a Separate Thread?

    - by maxp
    Normally I have a static class that reads and writes to HttpContext.Current.Cache. However, since adding threading to my project, the threads all get null reference exceptions when trying to retrieve this object. Is there any other way I can access it, a workaround, or another cache I can use?
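
    One thing worth trying: HttpContext.Current is null on threads that are not processing a request, but the same application cache is also exposed as HttpRuntime.Cache, which does not depend on a current context. A small sketch of a static wrapper built on that:

        using System;
        using System.Web;
        using System.Web.Caching;

        // Sketch: go through HttpRuntime.Cache, which is the same application cache
        // but reachable without an HttpContext (e.g. from background threads).
        public static class AppCache
        {
            private static Cache Store
            {
                get { return HttpRuntime.Cache; }
            }

            public static void Set(string key, object value, TimeSpan lifetime)
            {
                Store.Insert(key, value, null, DateTime.UtcNow.Add(lifetime), Cache.NoSlidingExpiration);
            }

            public static T Get<T>(string key) where T : class
            {
                return Store[key] as T;
            }
        }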

    Read the article

  • Accessing Sabre Web Services using PHP

    - by Peter
    I have been approached to create a website using Sabre Web Services to power the reservations system. All the documentation I have seen refers to .NET or Java solutions, so I was in doubt whether PHP can be used, since access is performed using SOAP. I have found no further information about this; I assume the answer is yes, but I wonder why there is not a single reference to this being possible - all solutions seem to be .NET. Any suggestions? Thanks!

    Read the article

  • POSTing JSON data to WCF REST

    - by Randall Sexton
    I'm trying to send data from a client application using jQuery to a REST WCF service based on the WCF REST starter kit. Here's what I have so far. Service Definition: [WebHelp(Comment = "Save PropertyValues to the database")] [WebInvoke(Method = "POST", UriTemplate = "PropertyValues_Save", BodyStyle = WebMessageBodyStyle.WrappedRequest, RequestFormat = WebMessageFormat.Json, ResponseFormat = WebMessageFormat.Json)] [OperationContract] public bool PropertyValues_Save(Guid assetId, Dictionary<Guid, string> newValues) { ... } Call from the client: $.ajax({ url:SVC_PROPERTYVALUES_SAVE, type: "POST", contentType: "application/json; charset=utf-8", data: jsonData, dataType: "json", error: function(XMLHttpRequest, textStatus, errorThrown) { alert(textStatus + ' ' + errorThrown); }, success: function(data) { if (data) { alert('Values saved'); $("#confirmSubmit").dialog('close'); } else { alert('Values failed to save'); $("#confirmSubmit").dialog('close'); } } }); Example of the JSON being passed: { "assetId": "d70714c3-e403-4cc5-b8a9-9713d05b2ee0", "newValues": [ { "key": "bd01aa88-b48d-47c7-8d3f-eadf47a46680", "value": "0e9fdf34-2d12-4639-8d70-19b88e753ab1" }, { "key": "06e8eda2-a004-450e-90ab-64df357013cf", "value": "1d490aec-f40e-47d5-865c-07fe9624f955" } ] } I'm using Windows Authentication on the virtual directory. When I call operations that are GETs, everything is fine. This code is prompting the browser to log in. When I enter my credentials, I simply get an alert in my browser which says "error undefined". Even if you can't help my specific error, do you see anything that looks wrong at a glance? I've been beating my head on this nearly all day. Thanks in advance.

    Read the article
