Search Results

  • Development environment for .NET

    - by user137348
    I'm about to create a development environment for a small developer company (3-4 developers + some testers). Our development platform is .NET and Oracle. My question is how to structure the whole environment. How many servers do I need? Should there be one server for developers and one for testers? I'd like to have one build server (TeamCity). Where should I put Subversion? Visual Studio would be on the developers' laptops. Do I need one database for development and an extra one for the build server? What else could be helpful?

  • Lookup function not working (RS SP2)

    - by Al Reyes
    Hi, I upgraded to SP2. I'm trying to use the Lookup function to link data from two different servers. I'm first trying a simple exercise linking data from two datasets on the same server, with one dataset containing journals and the other the account descriptions. My expression, in a field of the table, looks like this: =Lookup(Fields!ACTINDX.Value,Fields!ACTINDX.Value,Fields!ACTDESCR.Value,"ACCTINFO") I made sure of the names, using only uppercase for datasets and fields, but I receive the following message when I try to preview: "An error occurred during local report processing. The definition of the report '/DETAIL' is invalid. The Value expression for the text box 'ACTINDX' refers to the field 'ACTDESCR'. Report item expressions can only refer to fields within the current dataset scope or, if inside an aggregate, the specified dataset scope". I'd appreciate any suggestions. Regards, Al

  • Weblogic 10.3 domain unpacking problem

    - by MarkoU
    Hi, I'm trying to unpack a WebLogic 10.3 domain on one of our production servers (SunOS 5.10), but I get the following error: $ /opt/bea10/wlserver_10.3/common/bin/unpack.sh -template=/tmp/CM.jar -domain=/opt/bea10/user_projects/CM Error: failed to create the temporary script file Assuming that this is a privilege problem: where does the unpack utility actually try to create its temporary script files? The unpack script calls a Java class, com.bea.plateng.domain.script.Unpacker, so reading the script itself does not reveal the location. I need to ask the sysadmin for the privileges, so an exact directory location is needed. Of course, the error message is so vague that this might also be some other issue. Any ideas? BR, Marko P.S. Sorry for cross-posting. I tried this question on Serverfault as well but got no replies. Perhaps programmers (like myself) do this kind of stuff anyway.

  • Which application server should I choose for my project?

    - by Dimitri
    Hi folks, I am currently developing an application for some researchers at my university. It's a small Java program that you use from the command line. The next step is to package that program and deploy it to an application server. Some client programs will submit requests to the server, which will call the tool that I wrote. Later, we will add more tools to the server and it will have to dispatch the requests to the right tool. Which application server fits my needs? I have looked at Tomcat, Jetty and Glassfish, but it seems they are only used for web applications. Is it possible to use those servers in a context other than the web? Which package archive should I use (JAR, WAR)? Any advice?

  • COM+ Applications returns an error when I try to add a new application

    - by Baright
    Error message returned: 'An error occurred while processing the last operation. Error code 80040154 - Class not registered. The event log may contain additional troubleshooting information.' The other thing is, there is a red arrow icon displayed over My Computer (Component Services - Computers - My Computer). I have searched everywhere for this, but I couldn't find a solution that resolves my specific problem. I'm using Vista and this error started after I reinstalled my SQL Server 2008. I have 2 DLLs that I need to add, but I cannot because of this error.

  • Capistrano SSH::AuthenticationFailed, not prompting for password

    - by Sparkmasterflex
    I've been using Capistrano successfully for a while now and all of a sudden, in every project, I've lost the ability to deploy. Environment: OS X (Mavericks), Ruby 1.9.3p194, RVM (locally, not on the server), Rails 3.2 and up, RubyGems 1.8.25. I'm not using RSA keys or anything; I want Capistrano to prompt for user and password. Suddenly it has decided not to ask for a password, but it does ask for the user. Then it rolls back and gives me the following error: [deploy:update_code] exception while rolling back: Capistrano::ConnectionError, connection failed for: sub.example.com (Net::SSH::AuthenticationFailed: Authentication failed for user [email protected]) connection failed for: sub.example.com (Net::SSH::AuthenticationFailed: Authentication failed for user [email protected]) This has occurred on my personal laptop and on my iMac at work, and it occurs when deploying to two different servers (both Linux). I'm completely at a loss here. Any ideas?

  • MySQL Master-Slave Replication on a Large Database Table (how to sync initial data)

    - by Brian Lovett
    We have a production server and a dev server. We have found that backups are nearly impossible on the production server because of the query volume we experience. So, we're looking at setting up replication with our dev server as the slave. This is ideal because we can afford to lock the tables on that server, and additionally it will be nice to have up-to-date data for the developers. Now, the issues: the production server can't really be taken down or locked at this point, at least not easily. We have a high query volume and fairly large 30+ GB InnoDB tables. Both servers run InnoDB exclusively and both are on MySQL 5.1. What can we do to sync the data initially to get replication started? I've tried a few options, but so far none have worked.

  • WCF REST Service not working - not showing anything

    - by casperrawr
    I've been scratching my head for the past 20 hours or so trying to figure out what is wrong with my rudimentary WCF app, but with absolutely no luck :( I was following this tutorial: http://www.c-sharpcorner.com/UploadFile/dhananjaycoder/RESTEnabledService05122009034907AM/RESTEnabledService.aspx and for some reason the WCF service is showing a blank page. I checked IIS, reinstalled .NET 4.0, cleaned and redid the .svc handlers, tried on different test servers... and still, nada. Do you know what might be wrong with the configuration? I figured the code is simple enough (essentially the same as the page I posted) so it can't be the code itself... right? Any help will be appreciated :)
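
    A hedged aside, not part of the original question: one way to narrow down a blank-page symptom like this is to take IIS out of the picture and self-host the REST endpoint with WebServiceHost. The sketch below is not the tutorial's code; the contract, service and port names are made up, and it only shows the shape of a working webHttpBinding self-host to compare against.

        // Minimal WCF REST self-host (C#), useful for ruling out IIS/handler configuration.
        // The names (IHelloService, HelloService, port 8080) are hypothetical.
        using System;
        using System.ServiceModel;
        using System.ServiceModel.Web;

        [ServiceContract]
        public interface IHelloService
        {
            [OperationContract]
            [WebGet(UriTemplate = "hello/{name}")]
            string Hello(string name);
        }

        public class HelloService : IHelloService
        {
            public string Hello(string name) { return "Hello, " + name; }
        }

        class Program
        {
            static void Main()
            {
                // WebServiceHost adds a webHttpBinding endpoint with the webHttp behavior automatically.
                using (var host = new WebServiceHost(typeof(HelloService), new Uri("http://localhost:8080/")))
                {
                    host.Open();
                    Console.WriteLine("Try http://localhost:8080/hello/world - press Enter to stop.");
                    Console.ReadLine();
                }
            }
        }

    If a self-hosted version responds but the IIS-hosted one still shows a blank page, the likely suspects are the .svc handler mapping or the web.config endpoint/behavior configuration rather than the service code itself.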

  • Webserver for publishing adjusted images

    - by Petr Prikryl
    I want to create a webserver that will receive data (probably images) from web clients who upload these photos. The webserver must then execute some graphics program (the client selects which one) to modify the image(s). After the graphics program is done, the webserver will upload the results (the output images) to a temporary web database to publish them. I found the C++ Web Toolkit (http://www.webtoolkit.eu), but I'm not sure whether this is a suitable C/C++ library for me, because I don't have any experience with building C/C++ web apps/servers. Can someone give me some advice, please? The webserver should also have some kind of SDK for adding further graphics programs. I'm looking only for C/C++ tools/libraries/etc.

  • Recommend a local LDAP store for development

    - by Paul Stovell
    Our project uses an LDAP repository for storing users. In production this will be Active Directory. For development, we seem to have a couple of options: install an AD LDS instance that everyone uses, or install an AD LDS instance on every developer machine. We're trying to keep the 'F5' experience as lightweight as possible, so installing things or relying on a central AD store aren't my favorite ideas. There are other LDAP servers, like OpenLDAP. I was hoping there might be an LDAP server that simply talks to an XML file. This would allow us to store the XML file in source control and have something that is fast and works. Our nightly builds would still use AD to pick up any differences, but the hope is that since we're using LDAP it should Just Work. Can you recommend an LDAP implementation that works well for zero-config shared-nothing development?

  • Merging tables in MySQL - sum up columns

    - by Alan Williamson
    I have an interesting problem that I am sure has a simple answer, but I can't seem to find it in the docs. I have two separate database tables, on different servers. They both have an identical table schema with the same primary keys. I want to merge the tables together on one server, but if a row in Server1.Table1 also exists in Server2.Table2, then sum up the totals in the columns I specify. Table1 { column_pk, counter }: ("test1", 3), ("test2", 4). Table2 { column_pk, counter }: ("test1", 5), ("test2", 6). So after the merge I want: ("test1", 8), ("test2", 10). Basically I need to do a mysqldump, but instead of it kicking out raw INSERT statements, I need INSERT ... ON DUPLICATE KEY UPDATE statements. What are my options? I appreciate any input, thank you.
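
    A sketch of one option, assuming MySQL Connector/NET (MySql.Data) on the client side; the connection strings are placeholders and only the table/column names come from the question. It reads the rows from one server and upserts them into the other, summing the counter when the primary key already exists.

        // Sketch only: pull rows from the source server and upsert into the target,
        // summing counter on key collision. Connection strings are placeholders.
        using System;
        using MySql.Data.MySqlClient;

        class MergeCounters
        {
            static void Main()
            {
                const string upsert =
                    "INSERT INTO Table1 (column_pk, counter) VALUES (@pk, @counter) " +
                    "ON DUPLICATE KEY UPDATE counter = counter + VALUES(counter)";

                using (var source = new MySqlConnection("server=server2;database=mydb;uid=user;pwd=secret"))
                using (var target = new MySqlConnection("server=server1;database=mydb;uid=user;pwd=secret"))
                {
                    source.Open();
                    target.Open();

                    using (var readCmd = new MySqlCommand("SELECT column_pk, counter FROM Table2", source))
                    using (var reader = readCmd.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            using (var write = new MySqlCommand(upsert, target))
                            {
                                write.Parameters.AddWithValue("@pk", reader.GetValue(0));
                                write.Parameters.AddWithValue("@counter", reader.GetValue(1));
                                write.ExecuteNonQuery();
                            }
                        }
                    }
                }
            }
        }

    The same ON DUPLICATE KEY UPDATE idea also works against a dump: load the mysqldump output into a staging table on the target server, then run a single INSERT ... SELECT ... ON DUPLICATE KEY UPDATE counter = counter + VALUES(counter), which avoids the row-by-row round trips.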

  • How can I retrieve cookies for webserver A when my project is deployed on webserver B?

    - by medopal
    The project consists of multiple modules, each of them deployed to a separate webserver, all on the same mainframe (same IP address). I have a main menu where I log in and then list all the available modules on all servers. From here I can click and go to any of the modules. I send cookies in the response when logging in (say, on Server A); then on Server B (one of the modules), when I want to go back to the main menu, I check the cookies to see if the user is logged in. The problem is, Server B isn't seeing cookies generated by Server A, so each time I return to the main menu the user will be logged out. Is there any way to store cookies so they can be used by multiple virtual webservers (on the same IP), or any other idea?
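
    Browsers scope cookies by host name and path, not by IP address or port, so two virtual webservers on the same IP only share cookies automatically if they answer to the same host name. If the modules sit on sibling host names under one parent domain, the usual approach is to issue the cookie for that shared parent domain. The question doesn't say which stack the modules run on, so the sketch below is an ASP.NET-flavoured illustration only, and ".example.com" is a placeholder.

        // Hypothetical ASP.NET helper: issue the login cookie for the shared parent
        // domain so menu.example.com, moduleA.example.com, etc. can all read it.
        using System.Web;

        public static class AuthCookie
        {
            public static void Set(HttpResponse response, string token)
            {
                var cookie = new HttpCookie("SessionToken", token);
                cookie.Domain = ".example.com"; // shared parent domain, not one host
                cookie.Path = "/";              // visible to every application path
                cookie.HttpOnly = true;         // keep it away from client-side script
                response.Cookies.Add(cookie);
            }
        }

    If the modules instead share a single host name and differ only by port, no Domain trick is needed (ports are ignored when matching cookies) and the thing to check is the cookie Path; if they are on unrelated host names, cookies cannot be shared at all and a token passed between servers (or a shared session store) is needed instead.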

  • Processing SMTP bounces with .net

    - by justSteve
    I am looking for examples specific to .NET/MVC on Windows Server 2008, where the problem being addressed is processing a bounced SMTP message so as to bind it to an e-store transaction and update account/profile properties. Reading the related questions, I found an interesting reference to VERP. Under the heading 'Software that supports VERP' I find that IIS is not on the list. Does that mean I need to find a library to integrate into my store's assembly? What resources do I have to pull together to make sure that the webapp is informed when mail bounces? FWIW, I'm working with a very low-volume site.
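
    For what it's worth, VERP is not something IIS itself has to support: it is a convention the sending application applies to the bounce (return) address so that, when a bounce comes back, the address alone identifies the original recipient or transaction. A hedged C# sketch of just that encode/decode step is below; the "bounces" mailbox and "shop.example" domain are placeholders, not anything from the question, and something still has to collect the bounce messages (a drop directory, a POP3 pull, or a third-party component) and feed them to code like this.

        // Hypothetical VERP helpers: encode the recipient into the bounce address and
        // recover it from the To: address of a bounce. Names and domains are placeholders.
        using System.Text.RegularExpressions;

        public static class Verp
        {
            // customer@customer-domain.org -> bounces+customer=customer-domain.org@shop.example
            public static string BounceAddress(string recipient)
            {
                return "bounces+" + recipient.Replace("@", "=") + "@shop.example";
            }

            // bounces+customer=customer-domain.org@shop.example -> customer@customer-domain.org
            public static string OriginalRecipient(string bounceTo)
            {
                var match = Regex.Match(bounceTo,
                    @"^bounces\+(?<local>.+)=(?<domain>[^=@]+)@", RegexOptions.IgnoreCase);
                return match.Success
                    ? match.Groups["local"].Value + "@" + match.Groups["domain"].Value
                    : null;
            }
        }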

  • Force download menu for remote files

    - by o-logn
    Hey, I would like users to upload links on my site. When another user clicks on a link (e.g. to a PDF file), I would like the download popup to show instead of the PDF actually being displayed in the browser. I know I can use Response.AddHeader/Response.WriteFile to achieve this, but the WriteFile method requires a virtual path. However, the links uploaded by the users will point to external servers. Can I still force the download popup to show and, if so, what would be the most efficient way of doing it? Thanks for any advice.
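
    Since Response.WriteFile can only serve files the application itself can reach, the usual pattern for links that point at external servers is a small proxy: fetch the remote file server-side and stream it back with a Content-Disposition: attachment header. A hedged ASP.NET sketch follows; the handler name and query-string parameter are assumptions, and validating the requested URL against the links actually stored on the site is left out but important.

        // Hypothetical download proxy (e.g. /Download.ashx?url=...): streams the remote
        // file back with an attachment header so the browser shows the save dialog.
        using System;
        using System.IO;
        using System.Net;
        using System.Web;

        public class DownloadProxy : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                string url = context.Request.QueryString["url"]; // validate against stored links!
                var request = (HttpWebRequest)WebRequest.Create(url);

                using (var response = (HttpWebResponse)request.GetResponse())
                using (Stream remote = response.GetResponseStream())
                {
                    context.Response.ContentType = "application/octet-stream";
                    context.Response.AddHeader("Content-Disposition",
                        "attachment; filename=\"" + Path.GetFileName(new Uri(url).LocalPath) + "\"");

                    var buffer = new byte[8192];
                    int read;
                    while ((read = remote.Read(buffer, 0, buffer.Length)) > 0)
                        context.Response.OutputStream.Write(buffer, 0, read);
                }
            }

            public bool IsReusable
            {
                get { return true; }
            }
        }

    A plain redirect to the external link cannot force the save dialog, because the Content-Disposition header has to be sent by whichever server actually serves the bytes; proxying (or persuading the remote host to send the header) is what makes the popup appear.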

  • Do you leave Windows Automatic Updates enabled on your production IIS server?

    - by Nobody
    If you were running a 24/7 website on Windows Server 2003 (IIS6), would you leave the Windows automatic update feature enabled or would you turn it off? When enabled, you always get the latest security patches and bug fixes automatically as soon as they're available, which is the most secure choice. However, the machine will sometimes get automatically rebooted to apply the updates, leading to a couple of minutes of downtime in the middle of the night. I've also seen rare occasions where the machine does not restart correctly, resulting in further downtime. If auto updates are off, when do you apply the patches? I guess you have to use a load balancer with multiple web servers and rotate them out of the production site, apply patches manually, and put them back in. This can be logistically inconvenient when the load balancer is managed by a hosting company. You will also have machines in production that don't always have the latest security patches, and you have to routinely spend time deciding which patches to apply and when.

  • System.Net.Mail.SmtpClient cannot authenticate against a POP3 server, right?

    - by Herchu
    One of our customers seems to have a very old email system, the kind that asks you to authenticate to the POP3 server before allowing you to send messages through the SMTP server. Regrettably, we have to take our customer's word for it, as we cannot access their facilities. But as far as I remember, years ago there were mail systems where, once you logged into POP3, the SMTP server was kept open for a few minutes for the client IP. Our application sends messages using System.Net.Mail.SmtpClient, which seems to be unable to authenticate to those kinds of servers. Is that correct? If so, what would be the simplest workaround? I was thinking of a minimal POP3 implementation (just the login part of the protocol). Would that work? Thanks in advance.
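
    As far as I know, SmtpClient only speaks SMTP AUTH and has no notion of POP-before-SMTP, so the minimal workaround described in the question (a bare POP3 login over a socket, then the normal send) seems like the right shape. A sketch follows, with placeholder host names, credentials and addresses; whether the relay window is long enough, and whether it is keyed to the client IP, can only be confirmed against the customer's actual server.

        // POP-before-SMTP sketch: log into POP3 first so the mail system unlocks
        // relaying for this client IP, then send with SmtpClient as usual.
        // Host names, credentials and addresses are placeholders.
        using System;
        using System.IO;
        using System.Net.Mail;
        using System.Net.Sockets;
        using System.Text;

        class PopBeforeSmtp
        {
            static void Main()
            {
                using (var pop = new TcpClient("pop3.example.net", 110))
                using (var stream = pop.GetStream())
                using (var reader = new StreamReader(stream, Encoding.ASCII))
                using (var writer = new StreamWriter(stream, Encoding.ASCII) { AutoFlush = true, NewLine = "\r\n" })
                {
                    Console.WriteLine(reader.ReadLine());  // +OK greeting
                    writer.WriteLine("USER someuser");
                    Console.WriteLine(reader.ReadLine());
                    writer.WriteLine("PASS somepassword");
                    Console.WriteLine(reader.ReadLine());  // +OK here should unlock relaying for this IP
                    writer.WriteLine("QUIT");
                }

                using (var message = new MailMessage("sender@example.net", "recipient@example.org", "Test", "Body"))
                {
                    new SmtpClient("smtp.example.net").Send(message);
                }
            }
        }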

  • Should I Split Tables Relevant to X Module Into a Different DB? MySQL

    - by Michael Robinson
    I've inherited a rather large and somewhat messy codebase, and have been tasked with making it faster, less noodly and generally better. Currently we use one big database to hold all data for all aspects of the site. As we need to plan for significant growth in the future, I'm considering splitting tables relevant to specific sections of the site into different databases, so that if/when one gets too large for one server I can more easily migrate some user data to different MySQL servers while retaining overall integrity. I would still need to use joins on some tables across the new databases. Is this a normal thing to do? Would I incur a performance hit because of this?

  • Accepted date format changed overnight

    - by Eugene Niemand
    Since yesterday I have been encountering errors related to date formats in SQL Server 2008. Up until yesterday the following used to work: EXEC MyStoredProc '2010-03-15 00:00:00.000' Since yesterday I have been getting out-of-range errors. After investigating, I discovered that the date above is now being interpreted as "the 3rd day of the 15th month", which causes an out-of-range error. I have been able to fix this using the following format, with the "T": EXEC MyStoredProc '2010-03-15T00:00:00.000' Using this format it works fine. Basically, all I'm trying to find out is whether there is some hotfix or patch that could have caused this, as all my queries using the first format have been working for months. Also, this is not a setting that was changed by someone in the company, as this is occurring on all SQL 2005/2008 servers.
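
    For context (hedged as a likely rather than certain explanation): for the datetime type, 'yyyy-mm-dd hh:mm:ss' is interpreted according to the session's language/DATEFORMAT settings, while the ISO 8601 form with the 'T' is always parsed the same way, which is exactly why adding the 'T' fixes it. A changed default language on a login or server is therefore worth checking before hunting for a hotfix. Passing the value as a typed parameter side-steps string parsing altogether; a small sketch, where the @FromDate parameter name is an assumption and only the procedure name comes from the question:

        // Call the procedure with a typed DateTime parameter instead of a string
        // literal, so language/DATEFORMAT settings never come into play.
        using System;
        using System.Data;
        using System.Data.SqlClient;

        class CallProc
        {
            static void Main()
            {
                // Connection string and parameter name are placeholders.
                using (var connection = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
                using (var command = new SqlCommand("MyStoredProc", connection))
                {
                    command.CommandType = CommandType.StoredProcedure;
                    command.Parameters.Add("@FromDate", SqlDbType.DateTime).Value = new DateTime(2010, 3, 15);

                    connection.Open();
                    command.ExecuteNonQuery();
                }
            }
        }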

  • Use of BitTorrent for large file downloads as an alternative to FTP

    - by questzen
    The company I work for procures large volumes of data and does this by subscribing to FTP locations. I was wondering if it is possible to download the same data using a tracker; the major challenge is authentication of the users, IMO. Most FTP servers we subscribe to have a restriction on the number of FTP connection attempts. Does anyone here have any experience with this? Any advice is welcome. Edit: To clarify, we subscribe to third-party vendors and access their FTP locations using credentials provided by them. The service is not exclusive to us; they sell their data to several others. If we could be part of a swarm, the download rates would be pretty high without added penalty. The question is about the possibility of achieving this, so that we can put forth a proposal along those lines. The vendors obviously wouldn't share data with non-subscribers, so that is a constraint.

  • How long can a hash left out in the open be considered safe?

    - by Xeoncross
    If I were to leave a SHA-2 family hash out on my website, how long would it be considered safe? How long would I have before I could be sure that someone would find a collision for it and know what was hashed? I know that the amount of time would depend on the computational power of whoever is seeking to break it. It would also depend on the string length, but I'm curious just how secure hashes are. Since many of us run web servers, we constantly have to be prepared for the day when someone might make it all the way to the database which stores the user hashes. So, move the server security out of the way, and then what do you have? This is a slightly theoretical area for many of the people I have talked with, so I would love to have some more actual information about average expectations for cracking.
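
    A side note rather than an answer to the timing question: in practice, what decides whether a leaked hash gives the input away is less the hash family than whether the input is low-entropy (a short password) and whether it was salted. Below is a purely illustrative sketch of per-user salted SHA-256 in C#, not anything from the question; for passwords, a deliberately slow KDF such as PBKDF2 (Rfc2898DeriveBytes) is the more common recommendation than a single fast SHA-256 pass.

        // Illustration only: a per-user random salt defeats precomputed tables and
        // forces a separate brute-force attempt for each leaked hash.
        using System;
        using System.Security.Cryptography;
        using System.Text;

        static class Hashing
        {
            public static byte[] NewSalt(int size = 16)
            {
                var salt = new byte[size];
                using (var rng = RandomNumberGenerator.Create())
                    rng.GetBytes(salt);
                return salt;
            }

            public static string Hash(string secret, byte[] salt)
            {
                using (var sha = SHA256.Create())
                {
                    var data = Encoding.UTF8.GetBytes(secret);
                    var salted = new byte[salt.Length + data.Length];
                    Buffer.BlockCopy(salt, 0, salted, 0, salt.Length);
                    Buffer.BlockCopy(data, 0, salted, salt.Length, data.Length);
                    return Convert.ToBase64String(sha.ComputeHash(salted));
                }
            }
        }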

  • How to find out how much application memory a Django process is (or will be) taking?

    - by photographer
    There are different "Application memory" options (like 80MB...200MB) at a Django-friendly host called WebFaction, and I'm confused about which one I should buy. Could someone please walk me through how to figure out how much memory my project might require (excluding the operating system, the main Apache server and the database server's memory requirements)? I understand that in theory I'll need to perform some kind of load testing, but I thought there might be ways to estimate it in advance with some simple, relatively easy-to-understand approach. I don't know how strictly they enforce the application memory usage limit, and another question is: what will happen if more users come to the site and more threads start than I expected? Will the application crash? Or will delays just become uncomfortable? And no, the application is not ready yet (I can't measure anything right now). The development environment, if it matters, is Windows 7, 64-bit. The hosting itself is some kind of Linux, I think. (Sorry if it's not a Stack Overflow question.)

  • How can I force a subscriber to be synchronized from a local snapshot?

    - by Brian
    Hello, I have a SQL 2005 server replicating (merge, push) to SQL 2005 and SQL 2000 servers. I have multiple subscribers spread throughout the United States. I have set @snapshot_in_defaultfolder = N'false' and @alt_snapshot_folder = N'c:\snapshots\Merge\' (sample location). I take the snapshot from the publisher, which is in the same location, 'c:\snapshots\Merge\', and copy it to the subscribers. I wanted to avoid applying the snapshot over the WAN, but from the performance I am seeing, the synchronization is still going over the WAN. Does anybody have any ideas on how to make sure that I am using the local copy of the snapshot and not the copy at the publisher? Thanks

  • .NET vs Mono differences in Development

    - by jason
    I'm looking into Mono and .NET C#; we'll need to run the code on Linux servers in the future once the project is developed. At this point I've been looking at ASP.NET MVC and Mono. I run an Ubuntu distro and want to do development for a web application; some of the other developers use Windows and run other .NET items with Visual Studio. What does Mono not provide that Visual Studio does? If we'll be running this on Linux later, shouldn't we use MonoDevelop? Are there third-party tools or add-ins that might be an issue with Mono later?

  • Visual Studio 2010 Database Project does not understand Schema Names anymore?

    - by Xenan
    I just tried to upgrade a Visual Studio 2008 database project to VS2010, and it is actually quite a mess: hundreds of warnings, all unresolved references. It seems to boil down to Visual Studio not understanding schema names (aka ownership) anymore. For example, the standard dbo schema: [$(MyDataBase)].dbo.MyTable is fine, but: [$(MyDataBase)].myschema.MyTable gives an unresolved reference. It did work in VS2008. The abbreviation for dbo, the double dot: [$(MyDataBase)]..MyTable doesn't work anymore either. In the project properties window I restored the references to the correct servers (which were lost after the conversion), but that didn't help. This seems pretty basic, but I don't have a clue how to solve this. Any help is appreciated.

  • Is AppFabric mature enough for production?

    - by Incognito
    Hi, We have a lot of WCF services hosted in Windows services. As we are upgrading our servers to Windows Server 2008 R2, we are planning to migrate some of the services to WAS. Also, with AppFabric already released, it is interesting to know whether AppFabric is mature enough to be used, so maybe we can use it instead of WAS. Is anyone already using it in production? What are your impressions, as objectively as possible, of course :). Thank you.
