Search Results

Search found 9845 results on 394 pages for 'ntp servers'.


  • Should I Split Tables Relevant to X Module Into Different DBs? (MySQL)

    - by Michael Robinson
    I've inherited a rather large and somewhat messy codebase, and have been tasked with making it faster, less noodly and generally better. Currently we use one big database to hold all data for all aspects of the site. As we need to plan for significant growth in the future, I'm considering splitting tables relevant to specific sections of the site into different databases, so if/when one gets too large for one server I can more easily migrate some user data to different MySQL servers while retaining overall integrity. I would still need to use joins on some tables across the new databases. Is this a normal thing to do? Would I incur a performance hit because of this?

    Read the article

  • System.Net.Mail.SmtpClient cannot authenticate against a POP3 server, right?

    - by Herchu
    One of our customers seems to have a very old email system, the kind that asks you to authenticate against the POP3 server before allowing you to send messages through the SMTP server. Regrettably, we have to take our customer's word for it, since we cannot access their facilities. But as far as I remember, years ago there were mail systems where, once you logged into the POP3 server, the SMTP server was kept open for a few minutes for the client IP. Our application sends messages using System.Net.Mail.SmtpClient, which seems to be unable to authenticate against those kinds of servers. Is that correct? If so, what would be the simplest workaround? I was thinking of a minimal POP3 implementation (just the login part of the protocol). Would that work? Thanks in advance.
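
    A minimal sketch of that workaround, assuming the server really does POP-before-SMTP (host names and credentials are placeholders, and error handling is omitted): log into POP3 over a raw socket first, then send through SmtpClient as usual.

        using System.IO;
        using System.Net.Mail;
        using System.Net.Sockets;

        static class PopBeforeSmtpSketch
        {
            static void Main()
            {
                // Minimal POP3 login so the mail server will accept a later SMTP send
                // (POP-before-SMTP). Host names and credentials are placeholders.
                using (var pop = new TcpClient("pop.example.com", 110))
                using (var stream = pop.GetStream())
                using (var reader = new StreamReader(stream))
                using (var writer = new StreamWriter(stream) { AutoFlush = true, NewLine = "\r\n" })
                {
                    reader.ReadLine();                 // server greeting, e.g. "+OK ..."
                    writer.WriteLine("USER someuser"); // POP3 USER command
                    reader.ReadLine();
                    writer.WriteLine("PASS secret");   // POP3 PASS command
                    reader.ReadLine();                 // expect "+OK" before continuing
                    writer.WriteLine("QUIT");
                }

                // The SMTP server should now accept mail from this client's IP for a while.
                new SmtpClient("smtp.example.com")
                    .Send("from@example.com", "to@example.com", "test", "body");
            }
        }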

    Read the article

  • How long can a hash left out in the open be considered safe?

    - by Xeoncross
    If I were to leave a SHA2-family hash out on my website, how long would it be considered safe? How long would I have before I could be sure that someone would find an input that produces it and so learn what was hashed? I know that the amount of time would depend on the computational power of the one seeking to break it. It would also depend on the string length, but I'm curious just how secure hashes are. Since many of us run web servers, we constantly have to be prepared for the day when someone might make it all the way to the database which stores the user hashes. So, move the server security out of the way, and then what do you have? This is a slightly theoretical area for many of the people I have talked with, so I would love to have some more information about average expectations for cracking.
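
    The practical worry with a leaked hash of a short or guessable value is a brute-force or dictionary search rather than a true collision, and a per-value random salt is the usual mitigation. A rough sketch (the 16-byte salt size and the salt:digest storage format are just illustrative assumptions):

        using System;
        using System.Security.Cryptography;
        using System.Text;

        static class SaltedHashSketch
        {
            static void Main()
            {
                string secret = "correct horse battery staple";

                // Random per-value salt (16 bytes is an arbitrary illustrative choice).
                byte[] salt = new byte[16];
                using (var rng = RandomNumberGenerator.Create())
                    rng.GetBytes(salt);

                // Hash salt + input with SHA-256 and store both salt and digest.
                byte[] input = Encoding.UTF8.GetBytes(secret);
                byte[] salted = new byte[salt.Length + input.Length];
                Buffer.BlockCopy(salt, 0, salted, 0, salt.Length);
                Buffer.BlockCopy(input, 0, salted, salt.Length, input.Length);

                byte[] digest;
                using (var sha = SHA256.Create())
                    digest = sha.ComputeHash(salted);

                Console.WriteLine(Convert.ToBase64String(salt) + ":" + Convert.ToBase64String(digest));
            }
        }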

    Read the article

  • Accepted date format changed overnight

    - by Eugene Niemand
    Since yesterday I have been encountering errors related to date formats in SQL Server 2008. Up until yesterday the following used to work: EXEC MyStoredProc '2010-03-15 00:00:00.000' Since yesterday I have been getting out-of-range errors. After investigating, I discovered that the date above is now being interpreted as "the 3rd of the 15th month", which causes an out-of-range error. I have been able to fix this by using the following format with the "T": EXEC MyStoredProc '2010-03-15T00:00:00.000' With this format it works fine. Basically, all I'm trying to find out is whether there is some hotfix or patch that could have caused this, as all my queries using the first-mentioned format have been working for months. Also, this is not a setting that was changed by someone in the company, as it is occurring on all our SQL 2005/2008 servers.
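
    One way to sidestep date-string parsing entirely (a sketch, not an explanation of what changed overnight; the parameter name @FromDate and the connection string are placeholders for whatever the real procedure uses) is to pass the value as a typed DateTime parameter, so the server's DATEFORMAT/language settings never get a chance to misread it:

        using System;
        using System.Data;
        using System.Data.SqlClient;

        static class TypedDateSketch
        {
            static void Main()
            {
                // Connection string is a placeholder.
                using (var conn = new SqlConnection("Server=.;Database=MyDb;Integrated Security=true"))
                using (var cmd = new SqlCommand("MyStoredProc", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    // A DateTime parameter is sent as a typed value, not a string,
                    // so no regional date parsing happens on the server.
                    cmd.Parameters.Add("@FromDate", SqlDbType.DateTime).Value = new DateTime(2010, 3, 15);
                    conn.Open();
                    cmd.ExecuteNonQuery();
                }
            }
        }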

    Read the article

  • Use of BitTorrent for large file downloads as an alternative to FTP

    - by questzen
    The company I work for procures large volumes of data and does this by subscribing to FTP locations. I was wondering if it is possible to download the same data using a tracker; the major challenge is authentication of the users, IMO. Most FTP servers we subscribe to have a restriction on the number of FTP connections. Does anyone here have any experience with this? Any advice is welcome. Edit: To clarify, we subscribe to third-party vendors and access their FTP locations using credentials provided by them. The service is not exclusive to us; they sell their data to several others. If we could be part of the swarm, the download rates would be pretty high without added penalty. The question is about the possibility of achieving this, so that we can put forward a proposal along those lines. The vendors obviously wouldn't share data with non-subscribers, so that is a constraint.

    Read the article

  • How can I force a subscriber to be synchronized from a local snapshot?

    - by Brian
    Hello, I have a SQL 2005 server replicating (merge/push) to SQL 2005 and SQL 2000 servers. I have multiple subscribers spread throughout the United States. I have set @snapshot_in_defaultfolder = N'false' and @alt_snapshot_folder = N'c:\snapshots\Merge\' (sample location). I take the snapshot from the publisher, which is in the same location, 'c:\snapshots\Merge\', and copy it to the subscribers. I wanted to avoid applying the snapshot over the WAN, but judging from the performance I am getting, the synchronization is still going over the WAN. Does anybody have any ideas on how to make sure that I am using the local copy of the snapshot and not the copy at the publisher? Thanks

    Read the article

  • .NET vs Mono differences in Development

    - by jason
    I'm looking into Mono and .NET C#; we'll need to run the code on Linux servers in the future once the project is developed. At this point I've been looking at ASP.NET MVC and Mono. I run an Ubuntu distro and want to do development for a web application; some of the other developers use Windows and work on other .NET projects with Visual Studio. What does Mono not provide that Visual Studio does? If we are running this on Linux later, shouldn't we use MonoDevelop? Are there third-party tools or add-ins that might be an issue with Mono later?

    Read the article

  • How to find out how much application memory a Django process is (or will be) taking?

    - by photographer
    There are different "application memory" options (like 80MB...200MB) on a Django-friendly host called WebFaction, and I'm confused about which one I should buy. Could someone please walk me through how to figure out how much memory my project might require (excluding the operating system, the main Apache server, and the database server's memory requirements)? I understand that in theory I'll need to perform some kind of load testing, but I thought there might be ways to estimate that in advance with some relatively simple, understandable approach. I don't know how strictly they enforce the application memory usage limit, and another question is: what will happen if more users come to the site and more threads start than I expected? Will the application crash, or will delays just become uncomfortable? And no, the application is not ready yet (I can't measure anything right now). The development environment, if it matters, is Windows 7, 64-bit. The hosting itself is some kind of Linux, I think. (Sorry if it's not a Stack Overflow question.)
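
    A rough back-of-the-envelope illustration (the numbers are assumptions, not measurements): if the site runs, say, two Apache/mod_wsgi worker processes and each Django process settles at roughly 30-40 MB resident, that is about 60-80 MB before any caching layers, so an 80 MB plan would be tight and a 150-200 MB plan more comfortable. The real figure still has to come from watching the processes (for example with ps or top) under load.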

    Read the article

  • Visual Studio 2010 Database Project does not understand Schema Names anymore?

    - by Xenan
    I just tried to upgrade a Visual Studio 2008 database project to VS2010, and actually it is quite a mess: hundreds of warnings, all unresolved references. It seems to boil down to Visual Studio no longer understanding schema names (aka ownership). For example, the standard dbo schema: [$(MyDataBase)].dbo.MyTable is fine, but: [$(MyDataBase)].myschema.MyTable gives an unresolved reference. It did work in VS2008. Also, the abbreviation for dbo, the double dot: [$(MyDataBase)]..MyTable doesn't work anymore. In the project properties window I restored the references to the correct servers (which were lost after the conversion), but that didn't help. This seems pretty basic, but I don't have a clue how to solve it. Any help is appreciated.

    Read the article

  • Is AppFabric mature for production?

    - by Incognito
    Hi, we have a lot of WCF services hosted in Windows services. As we are upgrading our servers to Windows Server 2008 R2, we are planning to migrate some of the services to WAS. Also, now that AppFabric has been released, we are wondering whether it is mature enough to be used, so maybe we can use it instead of WAS. Is anyone already using it in production? And what are your impressions, as objectively as possible of course :). Thank you.

    Read the article

  • Writing a script to bypass college login page

    - by gtredcvb
    My college has a silly login page that requires you to download a whole bunch of garbage that a lot of us don't need (Norton Anti-Virus, antispyware software, etc.). We have to have them running to get on the internet on campus. However, if you are on Linux, or at least set your user agent to Linux, the requirements are gone. We could easily use Firefox with a user-agent switcher to bypass this, but it'd be nice to create a script that automates it. How would this be possible? I figure this could be written in Python, and could grab the web page with curl, specifying a user agent? How would I go about posting the data back to the servers? Thanks

    Read the article

  • Is it possible to synchronize with Microsoft Sync Framework through FTP transfers?

    - by Christian80
    I'm looking at synchronization methods between two databases and recently found Microsoft Sync Framework. I've been trying to investigate whether it suits my needs. My scenario is the following: two SQL databases located in different geographical locations. The remote database can go without an internet connection for days at a time, and for some locations the only means of communicating with the main server is FTP transfers. So my question is: is it possible to sync between two servers and send the sync information and data through an FTP server?

    Read the article

  • What's a good FOSS Java servlet session replication solution?

    - by Bossy Joe
    I work on a very high-volume public website running on Tomcat 5.5. Currently we require stickiness to a particular server in order to maintain session state. I'd like to start replicating sessions, but have had trouble finding a good FOSS solution. I've written my own Manager (using memcached as the store) but am having trouble dealing with race conditions when more than one server handles requests for the same user. Is there a solution out there I should be looking at? I'm looking not just for something that works as a fallback if stickiness fails, but for something that would work if user requests are regularly spread across multiple servers.

    Read the article

  • Message queue: selection and sizing

    - by user238591
    Hi, I have 20 messages/s, each 1-1.5 MB. I need high availability (2 to 4 servers minimum). I need low latency (high daily volume; keeping everything in RAM preferred). I need a persistent queue for poisoned messages. There are only a few clients (about 16), all local. I can have 12-16 GB of RAM per server (broker). Which JMS message queue / messaging product would you recommend? On what configuration (CPU/RAM)? Can I propose optional NAS persistence (in case of final delivery failure)? Thanks
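
    A quick sizing check on those figures: 20 msg/s at roughly 1.5 MB each is about 30 MB/s, i.e. around 108 GB per hour and roughly 2.5 TB per day, so 12-16 GB of RAM per broker holds only a few minutes of backlog. "Full RAM" can therefore only cover the in-flight messages, with everything else going to disk-backed persistence.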

    Read the article

  • Force rules for build and deployment

    - by Sazug
    Our web project is source-controlled with SVN. It contains an MSBuild file that produces local, test and production builds. We also use CruiseControl.NET to deploy production and test versions to servers manually (not after every commit). The question is how to ensure that when a production deployment is done through CC.NET, the web project is built with the production build (not the test or any other build). How can we force specific steps to be executed when building and deploying to production (like compressing JS and CSS, compiling with debug="false", etc.)? Right now any developer can change the MSBuild file (and so can forget to compress JS for the production build, etc.).

    Read the article

  • Can I use local::lib if it isn't installed globally, and without eval-ing it in the shell?

    - by xenoterracide
    I have a problem: I want to use local::lib in a script. But because I need to use this script in many places, I don't want to add the eval to .bashrc every time I install this script on a server, and I can't get local::lib installed globally (in the default @INC) on the servers. Is there any way I can use local::lib from within the script so that it knows where the local::lib module is, without the eval that local::lib recommends and without installing it into a directory in the default @INC on the server?

    Read the article

  • An elegant/simple way to check whether the internet is available or not

    - by Trainee4Life
    I did a quick search on how to check whether the internet is available or not. Most of the results talked about making interop calls to wininet.dll. One of the answers pointed towards the System.Net.NetworkInformation namespace. Exploring the namespace, I found a Ping class which can be used to ping our servers from code and check whether the server is available or not. What I want to ask is how this solution compares to the other solutions. Ping soPing = new Ping(); var soPingReply = soPing.Send("www.stackoverflow.com"); if (soPingReply.Status != IPStatus.Success) { // SO not available }
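
    One comparison point: ICMP pings are often blocked or deprioritized even when the HTTP port is open, so a lightweight HEAD request against the same host can be a more faithful availability check. A sketch (the URL and timeout are arbitrary choices):

        using System;
        using System.Net;

        static class HttpCheckSketch
        {
            static bool IsReachable(string url)
            {
                try
                {
                    var request = (HttpWebRequest)WebRequest.Create(url);
                    request.Method = "HEAD";      // headers only, no body transfer
                    request.Timeout = 5000;       // fail fast instead of hanging
                    using (var response = (HttpWebResponse)request.GetResponse())
                        return response.StatusCode == HttpStatusCode.OK;
                }
                catch (WebException)
                {
                    return false;                 // DNS failure, timeout, 4xx/5xx, etc.
                }
            }

            static void Main()
            {
                Console.WriteLine(IsReachable("http://www.stackoverflow.com/"));
            }
        }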

    Read the article

  • Started with a local git repo; now I want to push my changes to a remote server

    - by Eliseo Soto
    Hi, I started a new project and created a local git repo with "git init", and now I have a few branches and everything works great. However, since my web hosting company offers git hosting (if you're curious: https://support.eapps.com/index.php?_m=knowledgebase&_a=viewarticle&kbarticleid=203), I'd like to push my entire repo to their servers to have a backup in the cloud in case something bad happens to my local repo. How can I make the remote repo the "origin", given that the repo was started locally? Hope my question makes sense. Thanks, a Git newbie.
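
    For reference (the repository URL is a placeholder), the usual shape of the fix is to register the hosted repository as a remote named origin and push to it: git remote add origin ssh://user@yourhost/path/to/repo.git followed by git push origin master (with -u on newer git versions so the local branch tracks the remote).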

    Read the article

  • ActiveMQ showing more than one (ghost) consumer

    - by Shoey
    We have a system that uses ActiveMQ (queues) and has exactly one producer and one consumer (implemented as a Windows service in .NET). Over the weekend, the infrastructure team rebooted the servers on the network, and since then we have noticed more than one ghost consumer appearing that listens to the queue; we also suspect they read and delete the messages. My questions are: is there any way, from the ActiveMQ management console, to find out what the consumers are (hostnames, etc.)? And are there any scenarios in which inadvertent consumers get 'created'? For instance, there were suggestions that the ActiveMQ journal folders get corrupted after a reboot, and another suggestion that another machine with an ActiveMQ broker automatically makes itself a consumer of all the queues on the main/live ActiveMQ server.

    Read the article

  • Good resources for versioning...

    - by stephmoreland
    I have a number of Windows servers at work that are used for staging web sites for clients while they are being created. I want to start using version control on them so that when we work with outside vendors on a project and they overwrite my work, I can go back and get the previous version. My problem is that I don't think I'm searching with the correct terms: what resources are there for learning how to install version-control software, or a site to help me get started? Any and all suggestions would be appreciated. Steph

    Read the article

  • What is the proper way to handle a fully qualified domain in a GET request?

    - by Mark P Neyer
    I'm writing a proxy server. When I use curl to fetch a page, say http://www.foo.com/pants, curl makes the following request: GET /pants HTTP/1.1 When I have curl send that request through my local proxy, curl changes the GET request to: GET http://www.foo.com/pants HTTP/1.1 This change causes the foo.com server to return a 404. Is foo.com broken? Or is the fully qualified domain name only meaningful to proxy servers? Should I always strip http://domain from the requests I send out? Thanks!
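
    For what it's worth, HTTP/1.1 uses the absolute-URI form on requests sent to proxies (origin servers are supposed to accept it too, which makes the 404 arguable), and a proxy forwarding upstream typically rewrites the request line to the path form and relies on the Host header. A sketch of that rewrite (string handling simplified):

        using System;

        static class RequestLineRewriteSketch
        {
            // Turn "GET http://www.foo.com/pants HTTP/1.1" into "GET /pants HTTP/1.1"
            // plus the host to put in the Host header.
            static void Main()
            {
                string requestLine = "GET http://www.foo.com/pants HTTP/1.1";
                string[] parts = requestLine.Split(' ');

                var uri = new Uri(parts[1]);
                string originForm = parts[0] + " " + uri.PathAndQuery + " " + parts[2];

                Console.WriteLine(originForm);        // GET /pants HTTP/1.1
                Console.WriteLine("Host: " + uri.Host);
            }
        }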

    Read the article

  • Search Server 2008 Express working with WSS 3.0 (error when crawling second web application (website) s

    - by tberube
    My Search Server 2008 Express crawls 3 SharePoint servers and 1 Windows file server. The file server and 2 of the SharePoint servers crawl all content, and master and sub sites, just fine. On the 3rd SharePoint server the default site crawls just fine... The second web application site (content database) crawls the top site and crawls all sub sites, BUT the sub sites get the below error in the crawl log: Deleted by the gatherer (The start address or content source that contained this item was deleted and hence this item was deleted.) I have checked the security rights (OK). I have checked the setting to crawl a SharePoint site (if it were wrong, the top site would not work)... Last part: I can crawl 3 other SharePoint sites (web applications/other content databases) on the same server... It seems to be just this one site...

    Read the article

  • Using memcached/APC for session storage?

    - by Industrial
    Hi everybody, I had some thoughts a while back about using memcached for session storage, but came to the conclusion that it wouldn't be sufficient if one or more of the servers in the memcached pool went down. A hybrid version, to save the main database (MySQL) from load caused by reads, would be to write a function that tries to fetch the data from the cache pool and, if that fails, gets it from the database. After putting some more thought into it, I started to think about using the APC cache for session-related data. If our web server went down, sessions would be lost either way, so storing them in local APC or a localhost memcached server maybe isn't that bad? What are your experiences?
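
    A sketch of the read-through fallback described above (the in-memory dictionary and LoadFromDatabase are hypothetical stand-ins for a real memcached client and a MySQL query, not a particular library's API):

        using System;
        using System.Collections.Generic;

        static class ReadThroughSketch
        {
            // Stand-ins: a cache that may miss (or be down) and a database loader.
            static readonly Dictionary<string, string> Cache = new Dictionary<string, string>();
            static string LoadFromDatabase(string key) { return "row for " + key; }

            static string Get(string key)
            {
                string value;
                if (Cache.TryGetValue(key, out value))
                    return value;                    // cache hit: MySQL never sees the read

                value = LoadFromDatabase(key);       // cache miss (or cache server down)
                Cache[key] = value;                  // repopulate so later reads stay cheap
                return value;
            }

            static void Main()
            {
                Console.WriteLine(Get("session:42"));
                Console.WriteLine(Get("session:42")); // second call served from the cache
            }
        }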

    Read the article

  • Will a lock() statement block all threads in the process/AppDomain?

    - by MikeJ
    Maybe the question sounds silly, but I don't understand something about threads and locking, and I would like to get confirmation (here's why I ask). So, if I have 10 servers and 10 requests come to each server at the same time, that's 100 requests across the farm. Without locking, that's 100 requests to the database. If I do something like this: private static readonly object myLockHolder = new object(); if (Cache[key] == null) { lock(myLockHolder) { if (Cache[key] == null) { Cache[key] = LengthyDatabaseCall(); } } } How many database requests will I make? 10? 100? Or as many as I have threads?
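
    The same snippet laid out as a block with comments (Cache and LengthyDatabaseCall are taken from the question as-is); note that lock() only coordinates threads within a single process/AppDomain, so each server in the farm acquires it independently:

        private static readonly object myLockHolder = new object();

        if (Cache[key] == null)                // cheap first check, no lock taken
        {
            lock (myLockHolder)                // only one thread per process gets past here
            {
                if (Cache[key] == null)        // re-check: another thread may have filled it
                {
                    Cache[key] = LengthyDatabaseCall();  // at most one call per process
                }
            }
        }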

    Read the article

  • What frameworks exist for data subscription and update?

    - by Timothy Pratley
    There is one server with multiple clients. The clients are viewing subsets of the server's entire data. If the data that a client is viewing changes, the client should be informed of the changes so that it displays the current data. Example: two clients are viewing a list of users in an administration screen. One client adds a new user to the list and modifies the permissions of another user. The other client sees the changes propagated to their view. In the client-side code I would like the user list to be updated by the framework itself, raising change events so that it will be redrawn - similar to 'cells' or dataflow. I am looking specifically for a .NET or Java implementation.
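
    On the .NET side, the client-side half of this is close to what ObservableCollection<T> already provides: it raises CollectionChanged when items are added or removed, and data-bound views redraw themselves. A sketch (how the server pushes changes down to the client is the part a framework would still have to supply):

        using System;
        using System.Collections.ObjectModel;

        static class ChangeNotificationSketch
        {
            static void Main()
            {
                var users = new ObservableCollection<string>();

                // A bound grid would subscribe the same way and redraw on each event.
                users.CollectionChanged += (sender, e) =>
                    Console.WriteLine("List changed: " + e.Action);

                // Imagine these calls being made by the sync layer when the server
                // reports that another client added a user or changed permissions.
                users.Add("alice");
                users.Add("bob");
                users.Remove("alice");
            }
        }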

    Read the article
