Search Results

Search found 52798 results on 2112 pages for 'sharepoint web services'.


  • Migrating to Amazon AWS etc: What key statistics/questions should be analyzed and asked?

    - by cerd
    I searched Stack Overflow pretty extensively for something similar to this set of questions.

    BACKGROUND: We are a growing 'big(ish)' chemical data company that is outgrowing our lab and our dedicated production workhorses. Make no mistake, we need to do some serious query optimization: our data comes from a certain govt. agency, so the schema and lack of indexing are atrocious. So yes, I know AWS or EC2 is not a silver bullet compared with spending the time to rework our queries/code entirely. With that said, I would appreciate any input on the following questions:

    1. We produce on CentOS and lab on Ubuntu LTS, which I prefer, especially with its growing cloud/AWS integration. If we are MySQL-centric, and our biggest problem is big Cartesian products that produce slow queries, should we roll out what we know (Ubuntu/MySQL, after more optimization) with the added Amazon horsepower? Or is there some merit to the NoSQL and other technologies they offer?
    2. What are the key metrics I need to gather from Apache and MySQL, other than things like disk I/O operations, data up/down averages and trends, and special high-usage periods/scenarios?
    3. I've reviewed the AWS/EC2 fine print, but want second opinions: what other services, aside from the basic web/database ones, have proven valuable to you?
    4. I know nothing of Hadoop or many of the other technologies they offer. Echoing my previous question: do you sometimes find it worth it (initially a gamble, basic homework aside) to dive into a whole new environment and end up finding a way of producing your data/site product more efficiently?
    5. Anything I should watch out for in projecting costs, or any other general advice when working with AWS, from anyone whose company is very niche and very, very technical (scientifically, or anybody for that matter)?

    Thanks very much for your input - I think this thread could be valuable to others as well.

    Read the article

  • Openfire on EC2 with Jingle

    - by Bjorn Roche
    I would like to run Openfire (or another XMPP server) on EC2. At the moment this is just for testing, so easy setup and configuration are important, as is low cost. At some point, however, if things go well, it will be important to scale this. Ideally it would be nice not to have to switch software when the scaling happens, but if a switch needs to happen later, it certainly can. My requirements are:

    Basic XMPP services, including MUC and pubsub.
    Logins controlled from an external API. Preferably, when a user attempts to connect, the XMPP server checks with the API to see if their username and password are correct, but I can also have the API keep the XMPP server up to date on new users, deleted users, password changes and so on. I see Openfire has a "user service" API. Not ideal, but it looks workable.
    Jingle, including relay and STUN. It's not at all clear to me if the Jingle Nodes plugin takes care of this. I'm a bit confused about what's required to set this up, and I'd rather know in advance than be confused along the way :). E.g. it seems like STUN servers require more than one IP address.

    Can Openfire do all this for me, including STUN and media relay, on a single machine? Is this hard to configure on EC2 with Openfire? What are the basic steps? Would this be easier with something else, say Tigase? What about the database - should I use Amazon's database service, or run a DB on the same machine? Would the server be compatible with a service like http://www.siteuptime.com/ ? Thanks!

    Read the article

  • Connecting to RDS database from EC2 instance using bind9 CNAME alias

    - by mptre
    I'm trying to get internal DNS up and running on an EC2 instance. The main goal is to be able to define CNAME aliases for other AWS services. For example: instead of using the RDS endpoint, which might change over time, an alias mysql.company.int can be used instead. I'm using bind9, and here are my config files:

    /etc/bind/named.conf.local

        zone "company.int" {
            type master;
            file "/etc/bind/db.company.int";
        };

    /etc/bind/db.company.int

        ;
        $TTL 3600
        @       IN      SOA     company.int. company.localhost. (
                                20120617  ; Serial
                                604800    ; Refresh
                                86400     ; Retry
                                2419200   ; Expire
                                604800 )  ; Negative Cache TTL
        ;
        @       IN      NS      company.int.
        @       IN      A       127.0.0.1
        @       IN      AAAA    ::1
        ; CNAME
        mysql   IN      CNAME   xxxx.eu-west-1.rds.amazonaws.com.

    The dig command assures me my alias is working as expected:

        $ dig mysql.company.int
        ...
        ;; ANSWER SECTION:
        mysql.company.int. 3600 IN CNAME xxxx.eu-west-1.rds.amazonaws.com.
        xxxx.eu-west-1.rds.amazonaws.com. 60 IN CNAME ec2-yyy-yy-yy-yyy.eu-west-1.compute.amazonaws.com.
        ec2-yyy-yy-yy-yyy.eu-west-1.compute.amazonaws.com. 589575 IN A zzz.zz.zz.zzz
        ...

    As far as I can understand, a reverse zone isn't needed for a simple CNAME alias. However, when I try to connect to MySQL using my newly created alias, the operation gives me a timeout:

        $ mysql -uuser -ppassword -hmysql.company.int
        ERROR 2003 (HY000): Can't connect to MySQL server on 'mysql.company.int' (110)

    Any ideas? Thanks in advance!
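
    Since dig resolves the alias but the client times out, a quick way to separate DNS problems from network/firewall problems (for example, an RDS security group that doesn't allow the instance) is to resolve the name and then attempt a raw TCP connection to port 3306 from the same box. A minimal C# sketch; the hostname is the alias from the zone file above and the timeout is just an illustrative value:

        using System;
        using System.Net;
        using System.Net.Sockets;

        class RdsReachabilityCheck
        {
            static void Main()
            {
                string host = "mysql.company.int";   // the CNAME alias defined above

                // Step 1: does the alias resolve at all?
                IPAddress[] addresses = Dns.GetHostAddresses(host);
                foreach (IPAddress a in addresses)
                    Console.WriteLine("Resolved {0} -> {1}", host, a);

                // Step 2: can we actually open a TCP connection to MySQL (port 3306)?
                using (TcpClient client = new TcpClient())
                {
                    IAsyncResult ar = client.BeginConnect(host, 3306, null, null);
                    bool connected = ar.AsyncWaitHandle.WaitOne(TimeSpan.FromSeconds(5)) && client.Connected;
                    Console.WriteLine(connected
                        ? "TCP 3306 reachable - DNS and network are fine; look at MySQL credentials/grants."
                        : "TCP 3306 timed out - DNS is fine, so check the RDS security group / routing.");
                }
            }
        }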

    Read the article

  • Is there a way to determine which service (in svchost.exe) does an outgoing connection?

    - by fluxtendu
    I'm redoing my firewall configuration with more restrictive policies, and I would like to determine the provenance (and/or destination) of some outgoing connections. I have an issue because they come from svchost.exe and go to web content/application delivery providers - or similar:

        5 IPs in range 82.96.58.0 - 82.96.58.255      --> Akamai Technologies (akamaitechnologies.com)
        3 IPs in range 93.150.110.0 - 93.158.111.255  --> Akamai Technologies (akamaitechnologies.com)
        2 IPs in range 87.248.194.0 - 87.248.223.255  --> LLNW Europe 2 (llnw.net)
        205.234.175.175                               --> CacheNetworks, Inc. (cachefly.net)
        188.121.36.239                                --> Go Daddy Netherlands B.V. (secureserver.net)

    So is it possible to know which service makes a particular connection? Or what would you recommend for the rules applied to these ones? (Comodo Firewall & Windows 7)

    Update: netstat -ano and tasklist /svc help me a little, but there are many services in one svchost.exe, so it's still an issue. Moreover, the service names returned by "tasklist /svc" are not easily readable. (All the connections are HTTP (port 80), but I don't think that's relevant.)
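
    Once netstat -ano gives the owning PID, the awkward part is turning that PID into something readable when several services share one svchost.exe. A small sketch of one way to do that from C# via WMI (Win32_Service is a standard WMI class; the PID argument is whatever netstat reported). It still won't tell you which of those services opened the socket, but the display names are friendlier than the short names from tasklist /svc:

        using System;
        using System.Management;   // add a reference to System.Management.dll

        class SvchostServices
        {
            static void Main(string[] args)
            {
                int pid = int.Parse(args[0]);   // PID taken from "netstat -ano"

                // Win32_Service exposes the hosting process ID, so we can list every
                // service living inside that particular svchost.exe instance.
                string query = "SELECT Name, DisplayName FROM Win32_Service WHERE ProcessId = " + pid;
                using (ManagementObjectSearcher searcher = new ManagementObjectSearcher(query))
                {
                    foreach (ManagementObject svc in searcher.Get())
                    {
                        Console.WriteLine("{0,-20} {1}", svc["Name"], svc["DisplayName"]);
                    }
                }
            }
        }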

    Read the article

  • Recommendations for colocation in the US

    - by Emil
    Hello Server Fault. I work for a European media company and we are currently looking for colocation in the US. I know the European market quite well; unfortunately that is not the case for the US. I'm hoping you guys can help me out a bit with a few questions - it would be much appreciated!

    I am looking for a data center that can deliver a high level of availability (Tier 3 or better). The installation will be fairly large, so capacity is important, as is good internet connectivity/carrier presence. Most important, however, is good customer support: skilled, dedicated and responsive technical staff, since we won't have tech staff close by. I'm looking for a small and fast-moving company that targets internet businesses rather than big old enterprise hosting.

    What locations should we go for, given that we want to reach all of the US from a single site and still maintain decent latency? (Do we need east and west coast?) Where are the main internet hubs, and should you try to get as close to them as possible? Are there any good online resources I should look at? Where do the large-scale internet/media services colocate? Lastly, I would be very happy to get some actual recommendations for companies to talk to.

    P.S. I'm happy to return the favor if anyone has questions regarding data centers and colocation in Europe.

    Read the article

  • Migrating from Exchange 2003 to 2010: UID changes from 32 characters to 64 characters

    - by Seth
    We have built a custom CRM tool that integrates with Exchange 2010 using Exchange Web Services. The issue we are encountering revolves around editing appointments through the CRM tool that were created in Exchange 2003. We migrated the sales staff from Exchange 2003 to 2010 so that we could use EWS. EWS works great except for appointments that were created prior to the migration: those appointments, created in Exchange 2003, cannot be modified using EWS. The reason is that the ExchangeItemUID for the appointment changed from 32 characters to 64 characters, and EWS does not recognize ExchangeItemUIDs that are 32 characters.

    We are looking for a solution that will allow us to modify these appointments. We are open to running a script that updates all appointment events for the sales people so that 2003 appointments are converted to the 2010 format. We are also open to alternate IDs as opposed to using the UID. I have seen some references to using CleanGlobalObjectID, but I don't see that property in EWS. Has anyone encountered this problem before? Any help you could give would be greatly appreciated!
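
    For reference, CleanGlobalObjectID is not a first-class EWS property, but it can be read as a MAPI extended property through the EWS Managed API. A hedged sketch is below: the PSETID_Meeting GUID and property id 0x23 are quoted from memory of the [MS-OXOCAL]/[MS-OXPROPS] documentation, so treat them as assumptions to verify, and the credentials, mailbox and date range are placeholders.

        using System;
        using Microsoft.Exchange.WebServices.Data;   // EWS Managed API

        class CleanGoidDump
        {
            static void Main()
            {
                ExchangeService service = new ExchangeService(ExchangeVersion.Exchange2010);
                service.Credentials = new WebCredentials("user", "password", "domain");   // placeholders
                service.AutodiscoverUrl("user@example.com");                              // placeholder mailbox

                // PidLidCleanGlobalObjectId: property set PSETID_Meeting, id 0x23, binary.
                // These identifiers are assumptions to verify against [MS-OXOCAL] before relying on them.
                ExtendedPropertyDefinition cleanGoid = new ExtendedPropertyDefinition(
                    new Guid("6ED8DA90-450B-101B-98DA-00AA003F1305"), 0x23, MapiPropertyType.Binary);

                CalendarFolder calendar = CalendarFolder.Bind(service, WellKnownFolderName.Calendar);
                FindItemsResults<Appointment> appointments =
                    calendar.FindAppointments(new CalendarView(DateTime.Now.AddYears(-2), DateTime.Now));

                foreach (Appointment appt in appointments)
                {
                    appt.Load(new PropertySet(ItemSchema.Subject, AppointmentSchema.ICalUid, cleanGoid));

                    object goid;
                    string goidHex = appt.TryGetProperty(cleanGoid, out goid)
                        ? BitConverter.ToString((byte[])goid)
                        : "(not set)";

                    Console.WriteLine("{0}\n  ICalUid: {1}\n  CleanGlobalObjectId: {2}",
                        appt.Subject, appt.ICalUid, goidHex);
                }
            }
        }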

    Read the article

  • Problem with network after malware attack

    - by Cruelio
    I'm trying to help some friends with a Win XP machine. I got rid of the malware using Malwarebytes and HijackThis, but now they (and I) have another problem. When the computer boots into Windows it seems fine. When I start Internet Explorer the browser window opens just fine, but nothing happens for a minute or two. After the two minutes of waiting, the network icon appears in the taskbar next to the clock, and then everything works. The computer is connected to the internet using an Ethernet adapter.

    I have looked at the Event Log and found an error from PerfNet with event ID 2004:

        <Provider Name="PerfNet" />
        <EventID Qualifiers="49152">2004</EventID>
        <Level>2</Level>
        <Task>0</Task>
        <Keywords>0x80000000000000</Keywords>

    What I have tried so far: in Device Manager I have uninstalled the Ethernet adapter and installed it again. I have uninstalled and reinstalled the Windows File and Printer Sharing service. I have verified that both the Server and Workstation services are started. What should I do next?

    Read the article

  • Windows: How to make programs think they're not running in a terminal server session?

    - by sinni800
    I am using the program "SoftXPand 2011 Duo" by Miniframe on my Windows 7 PC. It makes two workstations out of one computer. It uses the terminal services built into Windows to create the additional session. I use two screens, two keyboards and two mice to create this "illusion" of two computers. It works quite well, and I can even play two different 3D games on the two screens attached to this single machine (using a Radeon HD5770 and a Core i5 2500K with 8 GB of RAM).

    There are a few downsides to this, and I just found one that is hidden at first look: the sessions you are in (even on the first workstation) identify themselves as terminal server sessions! Now some programs will run with limited (graphical) effects, and some won't run at all. This also resulted in some games not running at all - they just say "Cannot be run in a terminal server session" and exit. I have already proven that top modern games (DirectX 10, 11) run just as well as on the same machine without SoftXPand, so this is a pretty artificial limitation!

    So, can I somehow hack my current session so it doesn't look like a terminal server session anymore? I.e., so that

        #include <windows.h>
        #pragma comment(lib, "user32.lib")

        BOOL IsRemoteSession(void)
        {
            return GetSystemMetrics( SM_REMOTESESSION );
        }

    will return FALSE? (Not a programming question! Just an example of how programs detect whether they're in a terminal server session.)
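
    For what it's worth, managed programs usually make the same check through System.Windows.Forms, so any hack would have to fool both paths. A small C# sketch just to illustrate the detection a program performs (not a way to defeat it):

        using System;
        using System.Runtime.InteropServices;
        using System.Windows.Forms;

        class TerminalSessionCheck
        {
            const int SM_REMOTESESSION = 0x1000;

            [DllImport("user32.dll")]
            static extern int GetSystemMetrics(int nIndex);

            static void Main()
            {
                // Native check, same as the C snippet in the question.
                Console.WriteLine("GetSystemMetrics(SM_REMOTESESSION): {0}",
                    GetSystemMetrics(SM_REMOTESESSION) != 0);

                // Managed wrapper many .NET applications use for the same test.
                Console.WriteLine("SystemInformation.TerminalServerSession: {0}",
                    SystemInformation.TerminalServerSession);
            }
        }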

    Read the article

  • Set service dependency on internet connection

    - by nccsbim071
    Hi, I have created a Windows service and set some dependencies on it, such as MSMQ and MSSQLSERVER, and everything is working nicely. But I need to set another dependency for my service: a dependency on the internet connection. My service is responsible for sending emails. As soon as my server starts, my service starts too and checks whether there is anything to send; if there is, it starts sending email, and if during sending it is not able to connect to the internet, it cannot send the email. So I guess I should set my service's dependency on the internet connection too.

    I already set my Windows service's dependencies on Microsoft SQL Server and Microsoft Message Queuing by editing the registry: adding a new multi-string value named "DependOnService" of type REG_MULTI_SZ, with the names of the services that my service depends upon as the data. For Microsoft SQL Server I set the value to "MSSQLSERVER", but I don't know the name of the internet service that I need to set the dependency upon. How can I do this? Any help please. Thanks
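
    For reference, the DependOnService edit described above can be scripted rather than done by hand in regedit. A minimal sketch (the service key and dependency names are placeholders, and it must run elevated); note that this mechanism only orders service startup relative to other services - it cannot express "wait until internet connectivity is available":

        using Microsoft.Win32;

        class SetServiceDependencies
        {
            static void Main()
            {
                // Hypothetical service name - replace with the real one.
                const string serviceKeyPath = @"SYSTEM\CurrentControlSet\Services\MyEmailService";

                using (RegistryKey key = Registry.LocalMachine.OpenSubKey(serviceKeyPath, true))
                {
                    // REG_MULTI_SZ, one dependency per entry.
                    key.SetValue(
                        "DependOnService",
                        new[] { "MSMQ", "MSSQLSERVER" },
                        RegistryValueKind.MultiString);
                }
            }
        }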

    Read the article

  • Software to monitor bill payment to mission critical IT service providers (ISP, DNS etc.)

    - by Sholom
    Hi All,

    The problem: our very likable but absent-minded bookkeeper keeps neglecting to pay our IT vendors on time. Just this past week our internet service was disconnected. The same could happen to many other mission-critical accounts (domain registrar, backup MX, anti-virus license, HackerSafe (McAfee Secure) service and even an 800 number, to name a few). As the sysadmin, I monitor my servers to make sure they are plugged into the power outlet. I believe I should also monitor my services to make sure they are plugged into their money outlet. To compound the problem, when the power goes out someone else will likely notice and notify me, but if a bill is not paid, no one will ever notice until service is lost - lost as in losing our domain name, which would cause a lot more damage than the power failing on our server.

    [Solution] = [Doesn't work because]:
    Retrain the bookkeeper = wishful thinking.
    Notify my manager = already have (via email); protects me, does not solve the problem.
    Fire the bookkeeper = what makes you so sure the next one will never forget?

    Bottom line: humans are humans, and sooner or later something critical will be royally messed up. We need to partner with a machine to help us out here. Anybody have the same problem? What software/solution do you use? I would like software that emails me when a bill is past due, just like I get an email when the power outlet fails. Anyone heard of anything like that? Thanks
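
    Nothing vendor-specific to suggest, but the "email me when a bill is past due" part is small enough to sketch. The file format, SMTP host and addresses below are invented for illustration; the point is just a scheduled job that reads due dates and nags when one slips past today:

        using System;
        using System.IO;
        using System.Net.Mail;

        class BillReminder
        {
            // Each line in bills.csv: vendor,next-due-date (yyyy-MM-dd)  -- hypothetical format
            static void Main()
            {
                SmtpClient smtp = new SmtpClient("smtp.example.local");   // placeholder mail host

                foreach (string line in File.ReadAllLines("bills.csv"))
                {
                    string[] parts = line.Split(',');
                    string vendor = parts[0];
                    DateTime due = DateTime.Parse(parts[1]);

                    if (due.Date < DateTime.Today)
                    {
                        smtp.Send("billing-watch@example.local",
                                  "sysadmin@example.local",
                                  "OVERDUE: " + vendor,
                                  vendor + " was due on " + due.ToShortDateString() + " and has not been marked paid.");
                    }
                }
            }
        }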

    Read the article

  • AWS own email domain and some generic questions

    - by John Brunner
    I'm getting started with Amazon Web Services and I have a few questions I'm not sure about. Like every (company) webpage, I want to use an "[email protected]" email address, but how is that done? I looked at godaddy.com (for domain registration); they offer me an email address like I want, but for 3 dollars per month. Is this possible with AWS? Because at AWS you just get a complex domain name which is not very user-friendly or serious-looking.

    Also, I want to host my dynamic webpage on the Amazon cloud, but I'm not sure if I'm going about it right. I've read many guides, and all I know is that I have to purchase an Elastic Compute Cloud instance and a Simple Storage Service... and every guide works with the basic Linux package - why not Windows? Is it more expensive? I just want to host a MySQL server for the dynamic webpage, which is reached over a normal domain.

    And one last question: when I sign up for an AWS account it asks me for an email account, but I find it a little unprofessional to put my free webmail address there... How is it normally done? Thanks in advance! Best regards, john.

    Read the article

  • Map a URL bought with Dreamhost to Amazon EC2 (AWS)

    - by Edan Maor
    I have several URLs I purchased through Dreamhost. I'm starting to use Amazon's AWS, and I'd like to map the URLs to Amazon. This is something of a silly question, and I've already done the same thing several times to other services (mapping from Dreamhost to WebFaction), but when I tried to find the proper way to do the same mapping to Amazon, I found a lot of detailed writing about whether I should be using CNAME or A records, etc. So I wanted to ask in the simplest possible terms and hopefully get a simple, concrete answer: I bought a URL from Dreamhost, and I have an EC2 server running on AWS (to which I have already mapped an Elastic IP address). How do I make the URL map to AWS? And if there are several options, which one should I effectively be using?

    P.S. Meta-question - why are things so much more difficult with AWS? When I search Google for "move from Dreamhost to WebFaction", I get very simple answers on how to do the mapping. In what way is AWS different?

    Read the article

  • Performance data collection for short-running, ephemeral servers

    - by ErikA
    We're building a medical image processing software stack, currently hosted on various AWS resources. As part of this application, we have a handful of long-running servers (database, load balancers, web application, etc.). Collecting performance data on those servers is quite simple - my go-to recipe of Nagios (for monitoring/notifications) and Munin (for collection of performance data and displaying trends) will work just fine.

    However, as part of this application we are constantly starting up and terminating compute instances on EC2. In typical usage, these compute instances start up, configure themselves, receive a job from a message queue, and then get to work processing that job, which takes anywhere from 15 minutes to over 8 hours. After job completion, these instances get terminated, never to be heard from again.

    What is a decent strategy for collecting performance data on these short-lived instances? I don't necessarily need monitoring on them - if they fail for whatever reason, our application will detect this and handle re-starting the job on another instance or raising the flag so an administrator can take a look at things. However, it would still be useful to collect information like CPU (user, idle, iowait, etc.), memory usage, network traffic, disk read/write data, etc. In our internal database, we track the instance ID of the machine that runs each job, and it would be quite helpful to be able to look up performance data for a specific instance ID for troubleshooting and profiling.

    Munin doesn't seem like a great candidate, as it requires maintaining a list of Munin nodes in a text file - far from ideal for an environment with a high amount of churn - and for the short amount of time each node will be running, I'd rather keep the full-resolution data indefinitely than have RRD water down the data over time. In the end, my guess is that this will require a monitoring engine that:

    - uses a database (MySQL, SQLite, etc.) for configuration and data storage
    - exposes an API for adding/removing hosts and services

    Are there other things I should be thinking about when evaluating options? Perhaps I'm over-thinking this, though, and just ought to run sar at 1-minute intervals on these short-lived instances and collect the sar db files prior to termination.
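
    As a sketch of the "push from the instance, keyed by instance ID" idea: each short-lived node could post a metrics snapshot to a small collector API before it terminates. Everything here (the collector URL and the JSON shape) is invented for illustration; the instance ID comes from EC2's standard metadata endpoint:

        using System;
        using System.Net;

        class MetricsPush
        {
            static void Main()
            {
                using (WebClient web = new WebClient())
                {
                    // Standard EC2 instance metadata endpoint (only reachable from inside the instance).
                    string instanceId = web.DownloadString("http://169.254.169.254/latest/meta-data/instance-id");

                    // Hypothetical payload - in practice this would be parsed from sar/vmstat output.
                    string json = "{ \"instance\": \"" + instanceId + "\","
                                + " \"cpu_idle\": 87.5, \"mem_used_mb\": 1432, \"disk_read_kb\": 20480 }";

                    web.Headers[HttpRequestHeader.ContentType] = "application/json";

                    // Hypothetical collector endpoint backed by MySQL/SQLite, as described above.
                    web.UploadString("http://metrics.internal.example/api/samples", "POST", json);
                }
            }
        }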

    Read the article

  • What is the reason behind a Windows service stopping - is it due to LAN problems or some other issue?

    - by Steve
    I have a Windows service named Trunk which stopped one day, and I just want to know the reason behind it. This is an entry in the logs:

        Nov 15 17:54:04.318 :Trunk-1516:Trunk:handle_control_event:Received CTRL_LOGOFF_EVENT, ignore it
        Nov 25 15:54:52.157 :Trunk-1516:Trunk:ERROR - Process Restart Count (5) Exceeded for:C:\Program Files\secon\11.1.4\bin\vmd
        Nov 25 15:54:52.157 :Trunk-1516:Trunk:Stopping Trunk ...
        Nov 25 15:54:52.314 :Trunk-1516:Trunk:Shutting down, signaled C:\Program F
        Nov 20 15:54:20.345 :SCBridge.RegisterBridge:Exception in method:
            ScUtility.ScCommandException (0xa08990002): Exception from HRESULT: 0xa08990002
            Supplemental Information: None available.
            at ScServer.ScServiceProcessorRegistryManager.Attach(String serviceProcessor, ScClientInformation clientInfo, FORCE_ATTACH_SPEC forceAttachToMaster)
            at ScServer.ScServiceProcessorRegistry.Attach(String serviceProcessor, Object clientInfo)
            at ScServer.ScServiceProcessorRegistry.Attach(String serviceProcessor)
            at ServerControlInterface.SCBridge.RegisterBridge(String SPName)
            for system APOLLOSP0 attempting to attach and register with the Bridge

    I had seen that the service is registered with a specific account, so I thought that the user logging off from the machine might be the reason behind it, or some LAN disconnection problem. But having taken another look at the above entry, we seem to have a constant failure being generated in vmd, which causes Trunk to detect that vmd requires a restart. Most of the time it works OK and the restart count is anything up to 4. In this case the Trunk log confirms that the Restart Count is 5 and so is considered to be exceeded. Presumably this triggers the termination of the other services, and Trunk is actually doing its job. So, could this just be a timing issue, meaning we need to increase the tolerance level (i.e. the restart count), or do we need to address the 0xa08990002 error in vmd?

    Read the article

  • Windows service fails to start with local user until password is entered again in logon tab

    - by Nick
    Basically, we have a service that uses a local account as its logon. It has all the proper permissions, and everything is working fine: the service starts and runs and all is good. Then one day, after rebooting, the service fails to start, and the logs show an incorrect password. Our technicians resolve the issue by simply retyping the password into the "Log On" tab in services.msc. Unfortunately we have not been able to root-cause this. I suspect that the password stored for the service is somehow being lost. Does anyone know where the password hash might be stored, so we can check it?

    The only activities that seem possibly related are patching with Microsoft security patches, but we have multiple servers running the same service and we have never seen more than one affected at a time - and it's usually a different one each time this occurs. I believe this to be the same issue as this: Windows service fails to start with custom user until started once with local user. But I was unable to add comments there, and it's really old.
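
    For what it's worth, the technicians' manual fix (re-entering the password on the Log On tab) can be scripted, which at least shortens the outage while the root cause is hunted down. A hedged sketch that shells out to sc.exe; the service name, account and password are placeholders, and it must run elevated:

        using System.Diagnostics;

        class ResetServiceLogon
        {
            static void Main()
            {
                // Re-stamps the stored logon credential, then starts the service.
                // Equivalent to retyping the password in services.msc -> Log On.
                RunSc("config \"MyService\" obj= \".\\svc_account\" password= \"P@ssw0rd!\"");   // placeholders
                RunSc("start \"MyService\"");
            }

            static void RunSc(string arguments)
            {
                ProcessStartInfo psi = new ProcessStartInfo("sc.exe", arguments);
                psi.UseShellExecute = false;
                using (Process p = Process.Start(psi))
                {
                    p.WaitForExit();
                }
            }
        }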

    Read the article

  • Get the Windows Scheduler's Location Or Code?

    - by Ram
    Today I got a task to find a scheduled job that runs once a month; I have to change its period to once a week. I have searched for it in the Windows scheduled tasks, but I didn't find it. It sends a mail containing a link. Now I am confused about where else I can find it, as the place I know the scheduler could be has already been checked by me. Can someone suggest where else I can search for this scheduler? As per the previous developer's comment, "It is a normal Windows method of scheduling tasks".

    UPDATE: The task runs periodically, once a month. As far as I know it could be a Windows scheduled task or a Windows service created by the old developer. As the previous developer is not available and I do not have any documentation, I need to change the period from monthly to weekly. I have checked the scheduled tasks on the server and checked the services running on the server, and was unable to find the "scheduler". Now I have two questions: is there any other approach I can use to schedule an automated email periodically? And any ideas for finding this one?
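
    One low-tech way to hunt for it is to dump every registered task with full details and filter for something you know about the job (the mail subject, a script path, "month", etc.). A small sketch that wraps the stock schtasks tool; the search keyword is just an example. If nothing turns up, the job may live in a Windows service or an application-level scheduler instead.

        using System;
        using System.Diagnostics;

        class FindScheduledTask
        {
            static void Main()
            {
                string keyword = "month";   // example: a term related to the job being hunted

                ProcessStartInfo psi = new ProcessStartInfo("schtasks.exe", "/query /fo LIST /v");
                psi.RedirectStandardOutput = true;
                psi.UseShellExecute = false;

                using (Process p = Process.Start(psi))
                {
                    string output = p.StandardOutput.ReadToEnd();
                    p.WaitForExit();

                    foreach (string line in output.Split('\n'))
                    {
                        if (line.IndexOf(keyword, StringComparison.OrdinalIgnoreCase) >= 0)
                            Console.WriteLine(line.Trim());
                    }
                }
            }
        }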

    Read the article

  • What is the correct approach I should use for an application that requires Amazon S3 uploads and SimpleDB data management?

    - by Luis Oscar
    I am developing an application for iOS, and that is going smoothly; the problem is that I am very new at server-side things, and I am totally confused about how to correctly use Amazon Web Services for this purpose. What I want to do is very simple: I want my application to be able to query a servlet hosted on EC2 to retrieve pictures and data, based on some criteria, from S3 and SimpleDB respectively. The application should also be able to upload pictures into an S3 bucket and register the information in SimpleDB.

    My main concerns are security and costs. So far I was using the Amazon Token Vending Machine, but I haven't been successful in customizing it, and while researching I discovered that in the long run it is very expensive. The ultimate goal is to run a "social" picture service for my iOS application: being able to register new users, authenticate these users, and see which permissions they have to which pictures in the bucket - all without having to worry about third parties accessing my users' private pictures. Sorry for this question, but I am really clueless about how to handle this... I have tried reading many articles, but all this server stuff looks very scary.
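
    One common pattern for "the app never holds AWS credentials": the EC2 servlet authenticates the user, checks their permissions (e.g. in SimpleDB), and hands back a short-lived pre-signed S3 URL for exactly one object. A hedged sketch of the server-side piece, written against the AWS SDK for .NET for brevity (a Java servlet would do the same with the Java SDK); the bucket, key and credentials are placeholders:

        using System;
        using Amazon.S3;
        using Amazon.S3.Model;

        class PreSignedUrlIssuer
        {
            // Called after the server has authenticated the user and checked
            // that they may see this particular picture.
            static string IssueDownloadUrl(string bucket, string key)
            {
                AmazonS3Client s3 = new AmazonS3Client("ACCESS_KEY", "SECRET_KEY");   // placeholders

                GetPreSignedUrlRequest request = new GetPreSignedUrlRequest
                {
                    BucketName = bucket,
                    Key = key,
                    Verb = HttpVerb.GET,
                    Expires = DateTime.UtcNow.AddMinutes(10)   // the link dies quickly
                };

                return s3.GetPreSignedURL(request);
            }

            static void Main()
            {
                Console.WriteLine(IssueDownloadUrl("my-picture-bucket", "users/42/photo123.jpg"));
            }
        }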

    Read the article

  • MSBuild Script and VS2010 publish apply Web.config Transform

    - by Jason
    So, I have VS 2010 installed and am in the process of modifying my MSBuild script for our TeamCity build integration. Everything is working great with one exception: how can I tell MSBuild that I want to apply the Web.config transform files that I've created when I publish the build?

    I have the following, which produces the compiled web site, but it outputs Web.config, Web.Debug.config and Web.Release.config (all 3) to the compiled output directory. In Studio, when I perform a publish to the file system, it does the transform and only outputs the Web.config with the appropriate changes...

        <Target Name="CompileWeb">
          <MSBuild Projects="myproj.csproj" Properties="Configuration=Release;" />
        </Target>

        <Target Name="PublishWeb" DependsOnTargets="CompileWeb">
          <MSBuild Projects="myproj.csproj"
                   Targets="ResolveReferences;_CopyWebApplication"
                   Properties="WebProjectOutputDir=$(OutputFolder)$(WebOutputFolder);
                               OutDir=$(TempOutputFolder)$(WebOutputFolder)\;Configuration=Release;" />
        </Target>

    Any help would be great! I know this can be done by other means, but I would like to do it the new VS 2010 way if possible.

    Read the article

  • How to update data in the user information list when using FBA

    - by Flo
    I have to support a SharePoint web application which uses FBA with a custom membership provider and a custom role provider to authenticate users against two different LDAPs. The user data is only stored in the user information lists; the SSP user profiles are not used. Now one of the users got married and her surname changed in the LDAP (the one where her information is stored), but this change doesn't get provisioned into the user information list.

    I'm wondering what options I have to provision changes of user data into the user information list. I've already tried to update the user's last name manually, but it seems as if certain information like surname and first name is not editable in the user information list (I tried to edit it as a site administrator). So what options do I have to solve this problem? Being able to edit the information by hand would also be a solution, though of course not the preferred one.
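
    Since SSP profile sync isn't in play here, one workable approach is a small console job that reads the current values from LDAP and writes them into each site collection's user information list through the object model. A hedged sketch (the site URL, login name and new display name are placeholders, the internal names of other user-info columns vary by version, and updating the hidden list generally needs site-collection admin rights):

        using Microsoft.SharePoint;

        class SyncUserInfo
        {
            static void Main()
            {
                // Placeholders - in a real job these would come from the LDAP lookup.
                string siteUrl = "http://intranet.example.local";
                string loginName = "MEMBERSHIPPROVIDER:jane.doe";
                string newDisplayName = "Jane Married-Name";

                using (SPSite site = new SPSite(siteUrl))
                using (SPWeb web = site.OpenWeb())
                {
                    web.AllowUnsafeUpdates = true;

                    SPUser user = web.AllUsers[loginName];

                    // The user information list item behind that SPUser.
                    SPListItem info = web.SiteUserInfoList.GetItemById(user.ID);
                    info["Title"] = newDisplayName;   // display name column; illustrative only
                    info.Update();

                    web.AllowUnsafeUpdates = false;
                }
            }
        }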

    Read the article

  • Programmatically set a TaxonomyField on a list item

    - by Mel Gerats
    The situation: I have a bunch of terms in the Term Store and a list that uses them. A lot of the terms have not been used yet and are not yet available in the TaxonomyHiddenList. If they are not there yet they don't have an ID, and I cannot add them to a list item. There is a method GetWSSIdOfTerm on Microsoft.SharePoint.Taxonomy.TaxonomyField that's supposed to return the ID of a term for a specific site. It gives back IDs if the term has already been used and is present in the TaxonomyHiddenList, but if it's not, then 0 is returned. Is there any way to programmatically add terms to the TaxonomyHiddenList, or to force that to happen?
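
    For reference, a minimal sketch of setting a TaxonomyField from a Term object with TaxonomyField.SetFieldValue (the term-store name, list, field name and term GUID are placeholders). My understanding is that this call takes care of the hidden-list entry itself, so no WssId has to be looked up beforehand - which may be exactly the behaviour being asked about, but treat that as something to verify:

        using System;
        using Microsoft.SharePoint;
        using Microsoft.SharePoint.Taxonomy;

        class SetTaxonomyFieldValue
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://intranet.example.local"))   // placeholder URL
                using (SPWeb web = site.OpenWeb())
                {
                    TaxonomySession session = new TaxonomySession(site);
                    TermStore termStore = session.TermStores["Managed Metadata Service"];               // placeholder name
                    Term term = termStore.GetTerm(new Guid("11111111-2222-3333-4444-555555555555"));    // placeholder GUID

                    SPList list = web.Lists["Documents"];                            // placeholder list
                    SPListItem item = list.Items[0];
                    TaxonomyField field = (TaxonomyField)item.Fields["Product"];     // placeholder field

                    // Sets the managed metadata value directly from the Term itself.
                    field.SetFieldValue(item, term);
                    item.Update();
                }
            }
        }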

    Read the article

  • Why am I getting: InvalidOperationException: Failed to map the path '/app42/App_Code/'.

    - by serialhobbyist
    I've been working on a little Silverlight utility which calls a Silverlight web service. It works from my dev machine (XP SP2). I've tried publishing it to a 2008 R2 IIS 7.5 server, and it doesn't work when trying to contact the web service. So I've tried using the WcfTestClient to connect to the web service. That gave an error. So I turned off CustomErrors and used IE, and I get the following. Any idea why? There's no App_Code folder in the app.

        Failed to map the path '/app42/App_Code/'.

        Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

        Exception Details: System.InvalidOperationException: Failed to map the path '/app42/App_Code/'.

        Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

        Stack Trace:

        [InvalidOperationException: Failed to map the path '/app42/App_Code/'.]
           System.Web.Configuration.ProcessHostConfigUtils.MapPathActual(String siteName, VirtualPath path) +320
           System.Web.Configuration.ProcessHostServerConfig.System.Web.Configuration.IServerConfig.MapPath(IApplicationHost appHost, VirtualPath path) +34
           System.Web.Hosting.MapPathBasedVirtualPathEnumerator..ctor(VirtualPath virtualPath, RequestedEntryType requestedEntryType) +169
           System.Web.Hosting.MapPathBasedVirtualPathCollection.System.Collections.IEnumerable.GetEnumerator() +43
           System.Web.Compilation.CodeDirectoryCompiler.ProcessDirectoryRecursive(VirtualDirectory vdir, Boolean topLevel) +147
           System.Web.Compilation.CodeDirectoryCompiler.GetCodeDirectoryAssembly(VirtualPath virtualDir, CodeDirectoryType dirType, String assemblyName, StringSet excludedSubdirectories, Boolean isDirectoryAllowed) +11196502
           System.Web.Compilation.BuildManager.CompileCodeDirectory(VirtualPath virtualDir, CodeDirectoryType dirType, String assemblyName, StringSet excludedSubdirectories) +185
           System.Web.Compilation.BuildManager.CompileCodeDirectories() +654
           System.Web.Compilation.BuildManager.EnsureTopLevelFilesCompiled() +658

        [HttpException (0x80004005): Failed to map the path '/app42/App_Code/'.]
           System.Web.Compilation.BuildManager.ReportTopLevelCompilationException() +76
           System.Web.Compilation.BuildManager.EnsureTopLevelFilesCompiled() +1012
           System.Web.Hosting.HostingEnvironment.Initialize(ApplicationManager appManager, IApplicationHost appHost, IConfigMapPathFactory configMapPathFactory, HostingEnvironmentParameters hostingParameters) +1025

        [HttpException (0x80004005): Failed to map the path '/app42/App_Code/'.]
           System.Web.HttpRuntime.FirstRequestInit(HttpContext context) +11301302
           System.Web.HttpRuntime.EnsureFirstRequestInit(HttpContext context) +88
           System.Web.HttpRuntime.ProcessRequestNotificationPrivate(IIS7WorkerRequest wr, HttpContext context) +4338644

        Version Information: Microsoft .NET Framework Version:2.0.50727.4927; ASP.NET Version:2.0.50727.4927

    Read the article

  • Restlet vs Spring MVC for Restful web service

    - by zachariahyoung
    I'm researching how best to create a RESTful web service on Google App Engine. My end goal is to have an Android application call a web service on GAE to post and get data. At this point I'm not sure what the best approach is. What I know so far is that Spring MVC 3 provides the ability to create web services, but it does not provide a full implementation of JAX-RS. I have also read a few blogs that talk about how Spring and Restlet can be integrated, and on the other side I have read that I could use just Restlet on GAE. I would also like to provide a light web interface for users to view their posted data.

    So my questions are the following:
    1. Should I just use Restlet?
    2. Should I just use Spring MVC to provide my RESTful web service?
    3. Should I use Spring and Restlet together?

    At this point I think I should invest my time in Restlet, because that seems to be the best approach for calling web services from Android. I'm also debating whether Spring MVC is just overkill. Any thoughts would be helpful.

    Read the article

  • SPWeb ProcessBatchData

    - by BeraCim
    Hi all: I'm currently using SPWeb's ProcessBatchData to update a lot of list items/folders in the SharePoint database. For example, in one batch update I have ~1200 objects to update, and I also run it numerous times during different phases of my app. I found that it is generally faster and more efficient for updating large amounts of items. But lately the performance of the app has decreased as more processes (and also more batch updates) get added to the app. I was wondering whether ProcessBatchData could be a suspect in slowing down the app - e.g. temporarily locking up the database, draining resources, lagging the network, etc.? Thanks.
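
    For context, here is a stripped-down sketch of what a ProcessBatchData call looks like: the batch CAML is built as a string and submitted in one round trip (list name, field names and values are placeholders). The returned string contains per-Method error codes, which is worth inspecting when profiling slowdowns:

        using System;
        using System.Text;
        using Microsoft.SharePoint;

        class BatchUpdateSketch
        {
            static void Main()
            {
                using (SPSite site = new SPSite("http://intranet.example.local"))   // placeholder URL
                using (SPWeb web = site.OpenWeb())
                {
                    SPList list = web.Lists["Tasks"];   // placeholder list

                    StringBuilder batch = new StringBuilder();
                    batch.Append("<?xml version=\"1.0\" encoding=\"UTF-8\"?>");
                    batch.Append("<Batch OnError=\"Continue\">");

                    for (int i = 1; i <= 1200; i++)
                    {
                        batch.AppendFormat("<Method ID=\"{0}\" Cmd=\"Save\">", i);
                        batch.AppendFormat("<SetList>{0}</SetList>", list.ID);
                        batch.Append("<SetVar Name=\"ID\">New</SetVar>");   // "New" creates an item; use an existing ID to update
                        batch.AppendFormat(
                            "<SetVar Name=\"urn:schemas-microsoft-com:office:office#Title\">Item {0}</SetVar>", i);
                        batch.Append("</Method>");
                    }

                    batch.Append("</Batch>");

                    // One round trip for the whole batch; the result string reports any per-item errors.
                    string result = web.ProcessBatchData(batch.ToString());
                    Console.WriteLine(result);
                }
            }
        }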

    Read the article

  • IIS: No Session being handed out, but only in production

    - by Wayne
    I've reproduced this in a simple project - details below. It's a WCF service in ASP.NET compatibility mode. What I'm seeing is that when run on the dev machine (Win7), an HTTP session id is available inside the service operation (HttpContext.Current.Session is non-null). But when deployed to the server (Win2k8R2), I get "No session". On both machines the app is configured to use the classic app pool, and the app pools themselves are configured identically as far as I can tell. The only differences I can discern between the two applications are that on the dev box, under "Handler Mappings", ISAPI-dll is disabled (not on the server), and on the server there's a spurious handler called "AboMapperCustom-7105160" (does not exist on the dev box). What should I be looking at next? Am I missing something head-slappingly simple?

    Service is this:

        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
        public class Service2
        {
            [OperationContract]
            public string DoWork()
            {
                if (HttpContext.Current != null)
                {
                    if (HttpContext.Current.Session != null)
                    {
                        return "SessionId: " + HttpContext.Current.Session.SessionID;
                    }
                    else
                    {
                        return "No Session";
                    }
                }
                else
                {
                    return "No Context";
                }
            }
        }

    Config is:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <configSections>
            <section name="log4net" type="log4net.Config.Log4NetConfigurationSectionHandler,log4net, Version=1.2.9.0, Culture=neutral, PublicKeyToken=b32731d11ce58905" />
            <sectionGroup name="system.web.extensions" type="System.Web.Configuration.SystemWebExtensionsSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35">
              <sectionGroup name="scripting" type="System.Web.Configuration.ScriptingSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35">
                <section name="scriptResourceHandler" type="System.Web.Configuration.ScriptingScriptResourceHandlerSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication" />
                <sectionGroup name="webServices" type="System.Web.Configuration.ScriptingWebServicesSectionGroup, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35">
                  <section name="jsonSerialization" type="System.Web.Configuration.ScriptingJsonSerializationSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="Everywhere" />
                  <section name="profileService" type="System.Web.Configuration.ScriptingProfileServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication" />
                  <section name="authenticationService" type="System.Web.Configuration.ScriptingAuthenticationServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication" />
                  <section name="roleService" type="System.Web.Configuration.ScriptingRoleServiceSection, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" requirePermission="false" allowDefinition="MachineToApplication" />
                </sectionGroup>
              </sectionGroup>
            </sectionGroup>
          </configSections>
          <log4net>
            <appender name="LogFile" type="log4net.Appender.RollingFileAppender">
              <file value="C:\Temp\Test.log4net.log" />
              <rollingStyle value="Once" />
              <maxSizeRollBackups value="10" />
              <layout type="log4net.Layout.PatternLayout">
                <conversionPattern value="%d{ISO8601} [%5t] %-5p %c{1} %m%n" />
              </layout>
            </appender>
            <root>
              <level value="DEBUG" />
              <appender-ref ref="LogFile" />
            </root>
          </log4net>
          <appSettings />
          <connectionStrings />
          <system.web>
            <compilation debug="true">
              <assemblies>
                <add assembly="System.Core, Version=3.5.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089" />
                <add assembly="System.Data.DataSetExtensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089" />
                <add assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
                <add assembly="System.Xml.Linq, Version=3.5.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089" />
              </assemblies>
            </compilation>
            <!-- The <authentication> section enables configuration of the security authentication mode used by ASP.NET to identify an incoming user. -->
            <authentication mode="Windows" />
            <!-- The <customErrors> section enables configuration of what to do if/when an unhandled error occurs during the execution of a request. Specifically, it enables developers to configure html error pages to be displayed in place of a error stack trace. -->
            <customErrors mode="RemoteOnly" defaultRedirect="GenericErrorPage.htm">
              <error statusCode="403" redirect="NoAccess.htm" />
              <error statusCode="404" redirect="FileNotFound.htm" />
            </customErrors>
            <pages>
              <controls>
                <add tagPrefix="asp" namespace="System.Web.UI" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
                <add tagPrefix="asp" namespace="System.Web.UI.WebControls" assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
              </controls>
            </pages>
            <httpHandlers>
              <remove verb="*" path="*.asmx" />
              <add verb="*" path="*.asmx" validate="false" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
              <add verb="*" path="*_AppService.axd" validate="false" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
              <add verb="GET,HEAD" path="ScriptResource.axd" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" validate="false" />
            </httpHandlers>
            <httpModules>
              <add name="ScriptModule" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
            </httpModules>
          </system.web>
          <system.codedom>
            <compilers>
              <compiler language="c#;cs;csharp" extension=".cs" warningLevel="4" type="Microsoft.CSharp.CSharpCodeProvider, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089">
                <providerOption name="CompilerVersion" value="v3.5" />
                <providerOption name="WarnAsError" value="false" />
              </compiler>
            </compilers>
          </system.codedom>
          <!-- The system.webServer section is required for running ASP.NET AJAX under Internet Information Services 7.0. It is not necessary for previous version of IIS. -->
          <system.webServer>
            <validation validateIntegratedModeConfiguration="false" />
            <modules>
              <remove name="ScriptModule" />
              <add name="ScriptModule" preCondition="managedHandler" type="System.Web.Handlers.ScriptModule, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
            </modules>
            <handlers>
              <remove name="WebServiceHandlerFactory-Integrated" />
              <remove name="ScriptHandlerFactory" />
              <remove name="ScriptHandlerFactoryAppServices" />
              <remove name="ScriptResource" />
              <add name="ScriptHandlerFactory" verb="*" path="*.asmx" preCondition="integratedMode" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
              <add name="ScriptHandlerFactoryAppServices" verb="*" path="*_AppService.axd" preCondition="integratedMode" type="System.Web.Script.Services.ScriptHandlerFactory, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
              <add name="ScriptResource" preCondition="integratedMode" verb="GET,HEAD" path="ScriptResource.axd" type="System.Web.Handlers.ScriptResourceHandler, System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35" />
            </handlers>
          </system.webServer>
          <runtime>
            <assemblyBinding appliesTo="v2.0.50727" xmlns="urn:schemas-microsoft-com:asm.v1">
              <dependentAssembly>
                <assemblyIdentity name="System.Web.Extensions" publicKeyToken="31bf3856ad364e35" />
                <bindingRedirect oldVersion="1.0.0.0-1.1.0.0" newVersion="3.5.0.0" />
              </dependentAssembly>
              <dependentAssembly>
                <assemblyIdentity name="System.Web.Extensions.Design" publicKeyToken="31bf3856ad364e35" />
                <bindingRedirect oldVersion="1.0.0.0-1.1.0.0" newVersion="3.5.0.0" />
              </dependentAssembly>
            </assemblyBinding>
          </runtime>
          <system.serviceModel>
            <bindings>
              <basicHttpBinding>
                <binding name="BasicHttpBinding_Service2" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
                  <security mode="TransportCredentialOnly">
                    <transport clientCredentialType="Windows" />
                  </security>
                </binding>
              </basicHttpBinding>
            </bindings>
            <serviceHostingEnvironment aspNetCompatibilityEnabled="true" />
            <behaviors>
              <serviceBehaviors>
                <behavior name="WebApplication3.Service2Behavior">
                  <serviceMetadata httpGetEnabled="true" />
                  <serviceDebug includeExceptionDetailInFaults="false" />
                </behavior>
              </serviceBehaviors>
            </behaviors>
            <services>
              <service behaviorConfiguration="WebApplication3.Service2Behavior" name="WebApplication3.Service2">
                <endpoint address="" binding="basicHttpBinding" bindingConfiguration="BasicHttpBinding_Service2" contract="WebApplication3.Service2" />
              </service>
            </services>
          </system.serviceModel>
          <system.diagnostics>
            <sources>
              <source name="System.ServiceModel" switchValue="Information, ActivityTracing" propagateActivity="true">
                <listeners>
                  <add name="traceListener" type="System.Diagnostics.XmlWriterTraceListener" initializeData="c:\Temp\Test2.svclog" />
                </listeners>
              </source>
            </sources>
            <trace autoflush="true" indentsize="4">
              <listeners>
                <add name="traceListener2" type="System.Diagnostics.TextWriterTraceListener" initializeData="c:\Temp\Test.log" traceOutputOptions="DateTime" />
              </listeners>
            </trace>
          </system.diagnostics>
        </configuration>

    Testing with a simple console app:

        class Program
        {
            static void Main(string[] args)
            {
                ServiceReference1.Service2Client client = new ServiceReference1.Service2Client();
                Console.WriteLine(client.DoWork());
                Console.ReadKey();
            }
        }
    Read the article

  • Making an AJAX WCF Web Service request during an Async Postback

    - by nekno
    I want to provide status updates during a long-running task on an ASP.NET WebForms page with AJAX. Is there a way to get the ScriptManager to execute and process a script for a web service request during an async postback?

    I have a script on the page that makes a web service request. It runs on page load and periodically using setInterval(). It's running correctly before the async postback is initiated, but it stops running during the async postback, and doesn't run again until after the async postback completes. I have an UpdatePanel with a button to trigger an async postback, which executes the long-running task. I also have an instance of an AJAX WCF Web service that is working correctly to fetch data and present it on the page but, like I said, it doesn't fetch and present the data until after the async postback completes.

    During the async postback, the long-running task sends updates from the page to the web service. The problem is that I can debug and step through the web service and see that the status updates are correctly set, but the updates aren't retrieved by the client script until the async postback completes. It seems the ScriptManager is busy executing the async postback, so it doesn't run my other JavaScript via setInterval() until the postback completes.

    Is there a way to get the ScriptManager, or otherwise, to run the script to fetch data from the WCF web service during the async postback? I've tried various methods of using the PageRequestManager to run the script on the client-side BeginRequest event for the async postback, but it runs the script, then stops processing the code that should be running via setInterval() while the page request executes.

    Read the article
