Search Results

Search found 34010 results on 1361 pages for 'database connections'.


  • Authoritative sources about Database vs. Flatfile decision

    - by FastAl
    TL;DR: I'm looking for a reference to a book or other undeniably authoritative source that gives reasons for when you should choose a database vs. when you should choose other storage methods. I have provided an unauthoritative list of reasons about 2/3 of the way down this post.

    I have a situation at my company where a database is being used where it would be better to use another solution (in this case, an auto-generated piece of source code that contains a static lookup table, searched by binary search). Normally, a database would be an OK solution even though the problem does not require one: none of the elements of ACID are needed, as it is read-only data, updated about every 3-5 years (also requiring other source code changes); it fits in memory; and it can be keyed into via binary search (a tad faster than a db, but speed is not an issue).

    The problem is that this code runs on our enterprise server but is shared with several PC platforms (some disconnected, some using a central DB, etc.), and parts of it are managed by multiple programming units, parts by the DBAs, parts even by mathematicians in another department, etc. These hit their own platform's version of their databases (containing their own copy of the static data). What happens is that with every implementation, every little change, something different goes wrong. There are many other issues as well. I can't even use a flatfile, because one mode of running on our enterprise server does not have permission to read files (only databases and, of course, its own literal storage, e.g., an in-source table). Of course, other parts of the system use databases in proper, less obscure manners; there is no problem with those parts.

    So why don't we just change it? I don't have the administrative ability to force a change. But I'm affected, because sometimes I have to help fix the problems, but mostly because it causes outages and tons of extra IT time by other programmers, and d*mmit that makes me mad! The reason neither management nor the designers of the system can see the problem is that they propose a solution that won't work: increase communication, implement more safeguards and standards, etc. But every time, in a different part of the already-pared-down but still multi-step process, a few different diligent, hard-working, top-performing IT personnel make a unique subtle error that causes it to fail, sometimes after the last round of testing! And in general these are not single-person failures but understandable miscommunications. And communication at our company is actually better than most; people just don't think that's the case because they haven't dug into the matter. However, I have it on very good word from somebody with extensive formal study of sociology and psychology that the relatively small amount of less-than-proper database usage in this gigantic cross-platform, multi-source, multi-language project is bureaucratically unmaintainable. Impossible. No chance. At least with human beings in the loop, and it can't be automated. In addition, the management and developers who could change this, though intelligent and capable, don't understand the rigidity of this 'how humans are' issue, and are not convincible on the matter.

    The reason putting the static data in source code will solve the problem is that, although the solution is less sexy than a database, it would function with no technical drawbacks; and since the sharing of source code already works very well, you basically erase any database-related effort from this section of the project, along with all of its drawbacks that are causing problems. OK, that's the background, for the curious. I won't be able to convince management that this is an unfixable sociological problem, and that the real solution is coding around these limits of human nature, just as you would code around a bug in a 3rd-party component that you can't change. So what I have to do is exploit the unsuitableness of the database solution, and do it not using logic, but rather authority. I am aware of many reasons, and of posts on this site giving reasons for one over the other; I'm not looking for lists of reasons like these (although you can add a comment if I've missed a doozy):

    WHY USE A DATABASE (instead of a flatfile)? If you need...
    - Random read / transparent search optimization
    - Advanced / varied / customizable searching and sorting capabilities
    - Transactions / rollback
    - Locks, semaphores
    - Concurrency control / shared users
    - Security
    - 1-many / m-m relationships (easier)
    - Easy modification
    - Scalability
    - Load balancing
    - Random updates / inserts / deletes
    - Advanced queries
    - Administrative control of design, etc.
    - SQL / learning curve
    - Debugging / logging
    - Centralized / live backup capabilities
    - Cached queries / develop & cache execution plans
    - Interleaved update/read
    - Referential integrity; avoiding redundant/missing/corrupt/out-of-sync data
    - Reporting (from an OLAP or OLTP db) / turnkey generation tools

    Disadvantages:
    - Important to get right the first time (professional design), but only because it's meant to last
    - Software & hardware cost
    - Usually over a network, so speed is an issue (even the best design accessed locally still requires a separate process: marshalling, network layers, inter-process communication)
    - Indices and query processing can stand in the way of simple processing (vs. a flatfile)

    WHY USE A FLATFILE? If you only need...
    - Sequential row processing only
    - Limited usage, append only (no reading, no master key/update)
    - To update only the record you're reading (fixed-length records only)
    - Data too big to fit into memory
    - Local disk / read-ahead network connection
    - Portability / small system
    - Email / cut & paste / storage as a document by a novice (simple format)
    - Low design learning curve, but high cost later

    WHY USE IN-MEMORY TABLES (tables, arrays, etc.)? If you need...
    - Processing of a single db/flatfile record that was imported
    - Known size of data
    - Static data, if hardcoding the table
    - Narrow, unchanging use (e.g., one program or procedure); this includes a class that will be shared but encapsulates its data manipulation
    - Extreme speed / high transaction frequency
    - Random access (though search depends on the implementation)

    Following are some other posts about the topic:
    http://stackoverflow.com/questions/1499239/database-vs-flat-text-file-what-are-some-technical-reasons-for-choosing-one-over
    http://stackoverflow.com/questions/332825/are-flat-file-databases-any-good
    http://stackoverflow.com/questions/2356851/database-vs-flat-files
    http://stackoverflow.com/questions/514455/databases-vs-plain-text/514530

    What I'd like to know is whether anybody can recommend a hard, authoritative source containing these reasons: a paper book I can buy, or a reputable website with whitepapers about the issue (e.g., Microsoft, IBM), not counting the user-generated content on those sites. This will have a greater chance of eliciting the change I'm looking for: less wasted programmer time and more reliable programs. Thanks very much for your help. You win a prize for reading such a large post!
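    For illustration, a minimal sketch of the kind of in-source static lookup table described above: data baked into an auto-generated source file, searched with a binary search. The field names and values are invented for the example, not the poster's actual data.

        import bisect

        # Auto-generated section: rows sorted by key so bisect can search them.
        # Regenerated every few years along with the other source changes.
        RATE_TABLE = [
            ("AA", 1.25),
            ("AB", 1.40),
            ("BA", 2.10),
            ("CC", 3.75),
        ]
        _KEYS = [k for k, _ in RATE_TABLE]

        def lookup(key):
            """Binary-search the static, read-only table (no ACID required)."""
            i = bisect.bisect_left(_KEYS, key)
            if i == len(_KEYS) or _KEYS[i] != key:
                raise KeyError(key)
            return RATE_TABLE[i][1]

    Because the table ships inside the source, it travels through the already-working code-sharing process and needs no database connection, file-read permission, or per-platform copy.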


  • Filezilla FTP Server Ports - Active Connections

    - by Brian Webster
    I have been getting errors like the ones below because I did not specify enough ports for the FTP data connections. Response: 150 Opening data channel for directory list. Response: 425 Can't open data connection. Error: Failed to retrieve directory listing. Things seem to work nicely with limited ports, but when I perform actions that cause very rapid, short-lived connections, something like 20-30% of the connections drop with the error above. I started with ports 50000-50100. When I opened the range up to 50000-52000, the errors disappeared. Why did adding ports fix my problem? I have a suspicion that ports become "locked down" for a few moments around the time they are used in a connection; if connections are happening rapidly enough, there may be no ports available, hence the above error. Can anybody confirm?
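    The suspicion matches TCP behavior: after a short-lived connection closes, its port sits in the TIME_WAIT state (four minutes by default on Windows, controlled by TcpTimedWaitDelay) before it can be reused. A back-of-the-envelope sketch, with the connection rate an assumed figure:

        # Rough TIME_WAIT port-exhaustion model (illustrative numbers).
        TIME_WAIT_SECONDS = 240   # Windows default: 2 * MSL
        CONNS_PER_SECOND = 5      # assumed burst rate of short-lived data connections

        ports_in_time_wait = TIME_WAIT_SECONDS * CONNS_PER_SECOND
        print(f"~{ports_in_time_wait} ports tied up at steady state")
        # ~1200: more than the original 101-port range (50000-50100),
        # comfortably inside the widened 2001-port range (50000-52000).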


  • restore content database in sharepoint server 2007

    - by Boris
    I have a site collection set up at a web app running at port 80. I have made a backup of the site collection content db using the stsadm.exe tool. Now, I want to restore that backup as a new content db of a different site collection - the one set up at a web app running at port 500. I have done the following:

    - Created a backup
    - Created a new web app at port 500 (I did not create a site collection for this web app)
    - Removed the content db of that new web app using Central Administration
    - Ran stsadm.exe -o addcontentdb -url webapp-at-port-500 -databasename

    The command completes successfully; however, when I check the Content Database page for that web app, it says that the Number of Sites is 0! Also, when I try to open http://webapp-at-port-500, I get an error saying that the webpage cannot be found. Could anyone please help me, it's driving me crazy. Thanks.


  • Sharing two internet connections on my laptop running Windows XP

    - by ashwnacharya
    I have two internet connections: one is internet via our organization's corporate LAN, and the other is mobile broadband via a USB modem. Is there any way I can share these internet connections and use them simultaneously? I want to use the corporate LAN for normal browsing and connecting my email client, and I want to use the USB modem for establishing a VPN connection. Will I be able to maintain both connections simultaneously? Can I have parallel downloads, one using our corporate network and the other using the mobile broadband? Will I be able to switch my browser between these two connections? My laptop runs Windows XP Service Pack 2.


  • SQL Server 2000 msdb database loading/suspect

    - by Blake Parcell
    My SQL Server recently suffered a RAID controller/hard drive crash. After getting my hard drive problem corrected, I soon found that some of my databases were marked suspect, namely msdb. I am not a DBA by any means, but I am somewhat familiar with the daily SQL activities that happen on my server. So I restored from backup and tried to bring my msdb database online. It is now forever stuck in (Loading\Suspect) and I am unable to script backups for my important databases. I can recreate all of the backup plans etc. if I can somehow get a working msdb. Any help would be greatly appreciated. I am currently using: Microsoft SQL Server 2000, Version: 8.00.194
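    For context, "(Loading)" is the state SQL Server leaves a database in when a restore ran WITH NORECOVERY and it is still waiting for further log files. A hedged sketch of the usual way to bring it online, assuming the restore from backup actually completed (driver, server, and trust settings are placeholders):

        import pyodbc

        cn = pyodbc.connect(
            "DRIVER={SQL Server};SERVER=localhost;DATABASE=master;Trusted_Connection=yes",
            autocommit=True,  # RESTORE cannot run inside a user transaction
        )
        # Tell SQL Server no further log backups are coming and open the database.
        cn.execute("RESTORE DATABASE msdb WITH RECOVERY")

    If msdb is also still flagged suspect, it may need to be restored from the backup again before the WITH RECOVERY step will succeed.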


  • Light-weight, free, database query tool for Windows?

    - by NoCatharsis
    My question is very similar to the one here except pertaining to a Windows tool. I am also referencing this table and what I found here with a Google search. However, I have no idea which tool would best meet my (very basic) purposes. I am currently using Excel with a basic ODBC connection string to query my database at work. However, Excel is pretty memory-heavy and a basic query tends to throw my computer into a 30 second stall-a-thon. Is there a free tool out there that is light-weight and can serve the same purpose when provided an ODBC connection and a SQL query? Also would prefer that it easily copies over to a spreadsheet as needed.
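    Not a packaged tool, but for scale: the entire job described here fits in a few lines of Python with pyodbc, reusing the same ODBC connection string; the DSN and query below are placeholder assumptions.

        import csv
        import pyodbc

        cn = pyodbc.connect("DSN=WorkDB")        # any ODBC connection string works here
        cur = cn.cursor()
        cur.execute("SELECT * FROM some_table")  # hypothetical query

        with open("result.csv", "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow([col[0] for col in cur.description])  # header row
            writer.writerows(cur.fetchall())

    The resulting CSV opens directly in Excel, so the spreadsheet step comes free without Excel doing the querying.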


  • Downloading Database dump from server

    - by Ctroy
    I have a MySQL database on a server that is around 4 GB in size, and I couldn't get it downloaded to my local machine. I tried creating a dump on the server, but the dump is not getting created, probably because of the size. Is there any way I could get the dump downloaded to my local machine? I could sync it using SQLyog, but I know that will take ages. Is there a way I can get a dump created on the server? By the way, my server is a Linux server running PHP/MySQL.
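    One approach that often helps at this size is to compress the dump as it is written, so the server never has to hold the raw 4 GB file; a hedged sketch, equivalent to piping mysqldump into gzip (credentials and database name are placeholders):

        import subprocess

        # Stream the dump straight into gzip; the uncompressed SQL never
        # touches the disk. -p with no value prompts for the password,
        # keeping it out of the process list.
        with open("mydb.sql.gz", "wb") as out:
            dump = subprocess.Popen(
                ["mysqldump", "-u", "dbuser", "-p", "--single-transaction", "mydb"],
                stdout=subprocess.PIPE,
            )
            subprocess.run(["gzip", "-c"], stdin=dump.stdout, stdout=out, check=True)
            dump.stdout.close()
            dump.wait()

    Text dumps compress heavily, so the resulting file is usually a fraction of the database size and far quicker to download.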


  • How does fail2ban 0.9 database storage actually work?

    - by Arantir
    Fail2ban 0.9 introduces database storage to preserve bans across restarts, but I can't work out the actual mechanism of how it works. There is a dbpurgeage parameter which controls the lifetime of old bans and defaults to 24 hours. As I see from code research, fail2ban saves a ban to the db with timeofban equal to the moment the ban is saved. Then every dbpurgeage period it removes all bans with timeofban < MyTime.time() - self._purgeAge; in other words, it removes all bans that were stored more than 24 hours ago. But what if an IP was banned for a month? Does all this mean that with dbpurgeage = 86400, after a restart in 24 hours I will lose all bans longer than 24 hours? I just want all my permanent bans to be preserved in any case.
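    A condensed sketch of the purge logic as described above; the table and column names are illustrative rather than fail2ban's actual schema:

        import sqlite3
        import time

        DB_PURGE_AGE = 86400  # dbpurgeage: how long a ban row survives in the db

        db = sqlite3.connect("fail2ban.sqlite3")
        # Rows are removed purely by when the ban was *recorded*, not by
        # when it is due to expire, which is exactly the concern raised here.
        db.execute("DELETE FROM bans WHERE timeofban < ?", (time.time() - DB_PURGE_AGE,))
        db.commit()

    If the deletion really keys off timeofban alone, a month-long ban recorded more than dbpurgeage seconds before a restart would indeed be lost; setting dbpurgeage larger than the longest bantime would be the conservative workaround.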


  • MySQL problem with many concurrent connections

    - by user48303
    Hi, here's a six-core with 32 GB RAM. I've installed MySQL 5.1.47 (backport). The config is nearly standard, except max_connections, which is set to 2000. On the other side there is PHP 5.3/FastCGI on nginx. There is a very simple PHP application which should be served. nginx can handle thousands of requests in parallel on this machine. The application accesses MySQL via mysqli. When using non-persistent connections in mysqli, there is a problem when reaching 100 concurrent connections: [error] 14074#0: *296 FastCGI sent in stderr: "PHP Warning: mysqli::mysqli(): [2002] Resource temporarily unavailable (trying to connect via unix:///tmp/mysqld.sock) in /var/www/libs/db.php on line 7 I've no idea how to solve this. Connecting to MySQL via TCP is terribly slow. The interesting thing is, when using persistent connections (add 'p:' to the hostname in mysqli), the first 5,000-10,000 requests fail with the same error as above until max connections (from the webserver, set to 1500) is reached. After those first requests MySQL keeps its 1500 open connections and all is fine, so that I can make my 1500 concurrent requests. Huh? Is it possible that this is a problem with PHP FastCGI?
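    A note on mechanism: the [2002] "Resource temporarily unavailable" comes from the connect() call itself failing on the unix socket when a burst of simultaneous connection attempts overflows its accept backlog, which is why persistent connections behave differently once warmed up. A toy sketch of the reuse idea, with a generic connect callable standing in for mysqli:

        import queue

        class TinyPool:
            """Toy connection pool: connect once, reuse forever.

            Non-persistent mode performs a fresh connect() per request, and
            bursts of those can overflow mysqld's socket backlog (the EAGAIN
            error above). Reuse removes the per-request connect storm.
            """

            def __init__(self, connect, size):
                self._free = queue.Queue()
                for _ in range(size):
                    self._free.put(connect())   # pay the connect cost up front

            def acquire(self):
                return self._free.get()         # block until a connection is free

            def release(self, conn):
                self._free.put(conn)

    On the server side, the relevant knob for the burst case is usually the listen backlog (MySQL's back_log, capped by the kernel's somaxconn), not max_connections.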


  • Several Server Errors (No database connect, can't create TCP/IP socket, etc.)

    - by Tobias Baumeister
    My server stopped taking requests on my website today. It works for some time, but then the server just stops working and throws several errors:

    - 500 Internal Server Error
    - Warning: mysql_connect(): Can't create TCP/IP socket (105) in [...] on line 7
    - Couldn't connect to database. Please try again.
    - mysql_connect(): Host [...] is blocked because of many connection errors; unblock with 'mysqladmin flush-hosts'
    - mod_fcgid: can't apply process slot for /var/www/cgi-bin/cgi_wrapper/cgi_wrapper (this is from the error log)

    Any ideas what might cause it? The operating system is Ubuntu.
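    For reference, the "blocked because of many connection errors" message means MySQL has blacklisted the connecting host after more than max_connect_errors interrupted connection attempts, which fits the socket errors preceding it. A hedged recovery sketch; the client library and credentials are placeholder choices:

        import pymysql

        cn = pymysql.connect(host="localhost", user="root", password="...")
        with cn.cursor() as cur:
            cur.execute("FLUSH HOSTS")  # same effect as 'mysqladmin flush-hosts'
            cur.execute("SET GLOBAL max_connect_errors = 10000")  # raise the threshold

    This clears the symptom; the underlying socket error (105, no buffer space) still points at resource exhaustion on the box, worth investigating separately.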


  • Setting up MySQL database replication [without restarting mysql]

    - by FunkyChicken
    I'm trying to set up MySQL db replication; it seems pretty straightforward. I was using this tutorial: http://www.howtoforge.com/mysql_database_replication Now I run a rather large MySQL database for a very large website, and this tutorial asks me to restart MySQL to apply the new settings in the /etc/my.cnf file. I'm trying to avoid that step at all costs, as I know that restarting MySQL can take a few minutes on my machine (due to large logs/dbs), and I don't want any downtime. Is there a way to apply the necessary settings WITHOUT fully restarting MySQL?
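    Partly answerable with a sketch: the slave half of that tutorial needs no restart at all, and several replication variables are dynamic. The usual catch is on the master: enabling the binary log (log-bin) is not a dynamic variable in MySQL 5.x, so that particular change does require a restart. Host names, credentials, and log coordinates below are placeholders:

        import pymysql

        # Slave side: configurable entirely at runtime.
        slave = pymysql.connect(host="slave.example", user="root", password="...")
        with slave.cursor() as cur:
            cur.execute("SET GLOBAL server_id = 2")  # server_id is dynamic
            cur.execute(
                "CHANGE MASTER TO MASTER_HOST='master.example', "
                "MASTER_USER='repl', MASTER_PASSWORD='...', "
                "MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=4"
            )
            cur.execute("START SLAVE")

    So if the master already has log-bin enabled (many packaged configs do), the whole setup can happen without downtime; if not, the one unavoidable restart is the master's.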


  • Take all fields in a database table and put them straight into a text file

    - by DalexL
    I have an Access database (mdb) file that contains a dictionary of words - a couple thousand of them. I just need the words (in the order they are already in) put into a text file. Currently they have IDs associated with them (e.g. 1, 2, 3), but I don't need those; I just need the words. What is the best way to do this? Actually, if somebody is able to find a free online dictionary of English words (something along the lines of a Scrabble dictionary), I'll accept that too. I just can't seem to find any good ones online.
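    If Python is an option, the export is a few lines with pyodbc and the Access ODBC driver (Windows only); the table and column names are guesses to adapt to the actual schema:

        import pyodbc

        cn = pyodbc.connect(
            r"DRIVER={Microsoft Access Driver (*.mdb)};DBQ=C:\data\dictionary.mdb"
        )
        cur = cn.cursor()
        cur.execute("SELECT Word FROM Words ORDER BY ID")  # keep the existing order

        with open("words.txt", "w", encoding="utf-8") as f:
            for (word,) in cur:
                f.write(word + "\n")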


  • TCP Server Memory management: #Connections Vs. #Requests

    - by Andrew
    Given that there is no theoretical limit to the number of concurrent TCP connections a Windows 2008 server can handle, the only thing that will happen is that each connection consumes memory on the server. Unfortunately, memory is not unlimited (and I want to utilize only physical memory). For example, let's say we've got 2 GB of server memory. Now there are two extreme cases:

    Case 1: If we've allocated a 64 KB buffer for each connection (only to receive the incoming request), then 32768 connections can consume all 2 GB of memory. This will not leave any memory to queue/process incoming requests from those connections.

    Case 2: On the other hand, let's say a single (or very few) connections continuously keep sending request buffers (for example, video streaming from one connection to another) and the server cannot process them in time. Those buffers will pile up in the server, eventually occupy most of its memory, and leave no memory for new connections thereafter.

    This is the real dilemma in server design that has been bugging me badly for many days. If I can decide on a max request-buffer size per connection and a max number of queued requests per connection, then, based on available server memory, that automatically sets a limit on the max number of concurrent connections. How do I decide on these limits to achieve the best performance and throughput? I am just looking for perfect utilization of server resources. Are there any standard guidelines or empirical data someone can share with me, please?
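    The framing above reduces to one line of arithmetic; a sketch using the question's own numbers (the queue depth and request cap are assumptions):

        # Memory-budget model from the question (illustrative figures).
        TOTAL_MEMORY = 2 * 1024**3   # 2 GB available for connection handling
        RECV_BUFFER = 64 * 1024      # per-connection receive buffer (Case 1)
        QUEUE_DEPTH = 4              # max queued requests allowed per connection
        REQUEST_MAX = 64 * 1024      # cap on the size of a single queued request

        per_connection = RECV_BUFFER + QUEUE_DEPTH * REQUEST_MAX
        max_connections = TOTAL_MEMORY // per_connection
        print(f"{per_connection // 1024} KB/conn -> {max_connections} connections")
        # With QUEUE_DEPTH = 0 this degenerates to Case 1:
        # 2 GB / 64 KB = 32768 connections, exactly as computed above.

    Case 2 is then handled by the per-connection queue cap: a sender that outruns the server fills its own queue and gets throttled (the server stops reading its socket) instead of consuming the whole server's memory.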


  • Maximum execution time of 300 seconds exceeded error while importing large MySQL database

    - by Spacedust
    I'm trying to import a 641 MB MySQL database with the command:

    mysql -u root -p ddamiane_fakty < domenyin_damian_fakty.sql

    but I got an error:

    ERROR 1064 (42000) at line 2351406: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '<br /> <b>Fatal error</b>: Maximum execution time of 300 seconds exceeded in <b' at line 253

    However, the limits are set much higher:

    mysql> show global variables like "interactive_timeout";
    +---------------------+-------+
    | Variable_name       | Value |
    +---------------------+-------+
    | interactive_timeout | 28800 |
    +---------------------+-------+
    1 row in set (0.00 sec)

    and

    mysql> show global variables like "wait_timeout";
    +---------------+-------+
    | Variable_name | Value |
    +---------------+-------+
    | wait_timeout  | 28800 |
    +---------------+-------+
    1 row in set (0.00 sec)
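    Worth noting: the quoted syntax error is not a MySQL timeout at all. The text near line 2351406 is a PHP "Fatal error: Maximum execution time of 300 seconds exceeded" message, HTML tags included, embedded in the .sql file itself, which suggests the tool that produced the dump died mid-export and the file is truncated. A quick confirmation sketch:

        # Scan the dump for injected PHP error text, a sign of a broken export.
        needle = b"Maximum execution time"
        with open("domenyin_damian_fakty.sql", "rb") as f:
            for lineno, line in enumerate(f, start=1):
                if needle in line:
                    print(lineno, line[:120])

    If it matches, the fix is to regenerate the dump (e.g. with command-line mysqldump rather than a time-limited PHP exporter), not to tune the server's timeouts.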


  • MySQL: calculate the number of connections needed

    - by Udi I
    I am trying to figure out my needs regarding web service hosting. After trying Azure, I have realized that the default MySQL they provide (through a third party) limits the account to 4 connections. You can then upgrade the account to 15, 30 or 40 connections (which is quite expensive). Their 15-connections plan is described as: "Excellent choice for light test and staging apps that need a reliable MySQL database". I have 2 questions: If my application is a web service which needs to perform ~120k queries a day (normal/bell distribution) and each query is ~150ms (duration) / ~400ms (fetch), how many connections do I need? If, instead of cloud computing, I choose a VPS, how many connections will I be able to handle on a 1 GB, 2-core VPS? Thank you!
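    On the first question, Little's law gives a first-order answer: average concurrency = arrival rate x time each query holds a connection. A sketch with the question's figures; the peak-to-average factor is an assumption for the bell-shaped load:

        # Little's law estimate using the numbers from the question.
        QUERIES_PER_DAY = 120_000
        HOLD_TIME = 0.150 + 0.400   # seconds a query occupies a connection
        PEAK_FACTOR = 4             # assumed peak-to-average ratio

        avg_rate = QUERIES_PER_DAY / 86_400        # ~1.4 queries/second
        concurrent = avg_rate * HOLD_TIME * PEAK_FACTOR
        print(f"~{concurrent:.1f} connections busy at peak")  # ~3

    By this estimate even the 15-connection plan has ample headroom, and connection count is unlikely to be the binding constraint on a small VPS either.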


  • Lingering database connections from Feng Office

    - by Bobby
    I've installed Feng Office on our main server, and it is working perfectly so far. Unfortunately it seems like there's a problem with the connection to the MySQL database. While the connection itself works fine, it's the reuse/pooling of connections which seems to be bugged. There are lingering/sleeping connections to the server from Feng Office which won't close and don't get reused after some time (120 seconds). Of course those lingering processes/connections are piling up pretty fast. I've found a thread on the forums about this behavior, but the suggested fix is already applied (by default). I'm sure this is just a configuration issue, but I'm a little clueless, because besides a MediaWiki, a DokuWiki and homebrewed PHP applications, Feng is the only one with this issue. The setup is a Microsoft Windows 2003 Server with MySQL 5.0.26 and Apache 2.2. Where can I start looking for clues why this is happening, and how do I get rid of lingering MySQL connections?
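    Whatever the Feng Office fix turns out to be, MySQL can reap the sleeping connections itself once they idle past wait_timeout; a stop-gap sketch (the value and the client library are example choices, and the setting only affects connections opened after the change):

        import pymysql

        cn = pymysql.connect(host="localhost", user="root", password="...")
        with cn.cursor() as cur:
            # Close connections that sit in 'Sleep' for more than 2 minutes.
            cur.execute("SET GLOBAL wait_timeout = 120")

    This is safe as long as the application reconnects when it finds its connection gone, rather than assuming it lives forever.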


  • Easy management of multiple connections

    - by pistacchio
    Hi, on a Windows 7 machine I need to switch back and forth between two connections: the LAN connection (which works behind a proxy) and a wireless connection via a USB modem. Every time, this requires activating/deactivating the two connections in the manager and switching the LAN settings on/off. I don't think there is a way to keep two connections alive and specify which one to use based on the program, but my question is: is there a simple little utility that makes switching the connection one-click easy and fast? Thanks


  • Looking for a central image database and tagging system for a group of users

    - by jstarek
    I'm doing IT support for a small volunteer organisation that needs to centrally store and organize around 2500 photos. Can anyone here recommend a database or similar system which matches the following criteria:

    - Intuitive to use for users with little computer experience
    - Multi-user support, ideally with integration into our existing LDAP user directory
    - Should have a web interface
    - Not a hosted solution like Picasa (because we have a rather slow internet connection with very slow upload)
    - Should allow tagging of images, sorting by various criteria, and storing copyright information

    If there are native GNOME and/or Windows clients for the tool, that would be a great benefit. Many thanks in advance!


  • Using mongodump with an auth enabled mongodb server

    - by bb-generation
    I'm trying to do a daily backup of my MongoDB server (auth enabled) using the mongodump tool. mongodump provides two parameters to set the credentials:

    -u [ --username ] arg    username
    -p [ --password ] arg    password

    Unfortunately it provides no parameter to read the password from stdin, so every time I run this command, everyone on the server can read the password (e.g. by using ps aux). The only workaround I have found is stopping the database and directly accessing the database files using the --dbpath parameter. Is there any other solution which allows me to back up the MongoDB database without stopping the server and without "publishing" my password? I am using Debian squeeze 6.0.5 amd64 with mongodb 1.4.4-3.


  • Installing Oracle 11gR2 on Red Hat Linux

    - by KItis
    I have a basic question about installing applications on a Linux operating system; I am going to express my issue using an Oracle DB installation as an example. When installing the Oracle database, I created a user group called dba and a user in this group called ora112, so this user is allowed to install the database. My question is: if ora112's umask is set to 077, then no other users will be able to configure the Oracle database. Why do we need to follow this practice? Is it an accepted procedure in application installation on Linux? Please share your experience with me; thanks in advance for looking into this issue. Say I install a Java application this way: then no other application belonging to a different user account will be able to use the Java running on this computer, because of this access restriction.
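    For reference, a umask is just a set of permission bits subtracted from every file the user creates; a self-contained sketch of what 077 does:

        import os

        os.umask(0o077)  # strip all group/other bits from anything created below

        fd = os.open("demo.txt", os.O_CREAT | os.O_WRONLY, 0o666)
        os.close(fd)

        mode = os.stat("demo.txt").st_mode & 0o777
        print(oct(mode))  # 0o600: owner read/write only; group and others get nothing

    Files created during an install under this umask come out owner-only, which answers the Java worry too: binaries installed with umask 077 would indeed be unusable by other accounts unless their permissions are loosened afterwards.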


  • Connections to IIS sometimes get stuck in CLOSE_WAIT state

    - by randomhuman
    Our application includes an ASP.NET web service that only needs to deal with a handful of clients. As such, the 10 incoming connection limit of Windows XP Pro is generally not a problem. However, on one particular server, connections occasionally become stuck in the CLOSE_WAIT state. These connections build up over time, and eventually new client connections are refused because the maximum number of connections is used up. From my googling it sounds like a failure of the web service to properly close the connection can cause this problem, but as it works just fine on hundreds of other Windows XP Pro machines, I can't see it being a bug in our code. It also ran fine on the affected machine until some shenanigans on the part of the end user (I think they set about deleting duplicate files in order to reduce their disk usage, but they did not exactly come clean about it). What could the user have changed to introduce this problem? Is there any way I can force connections that are in CLOSE_WAIT to time out rather than letting them hang around? I have seen suggestions to reduce TcpTimedWaitDelay, but that only relates to the TIME_WAIT state, and changing it did not have any effect.
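    Background on the state itself: CLOSE_WAIT means the remote end has sent its FIN and the local application has not yet called close() on its socket; no registry timeout (TcpTimedWaitDelay included) applies, so only the owning process can release it. A toy sketch that manufactures the state on purpose, to show it is purely an application-level leak (not the poster's actual service):

        import socket

        # Toy server that deliberately leaks CLOSE_WAIT sockets.
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.bind(("127.0.0.1", 8080))
        srv.listen(5)

        leaked = []
        while True:
            conn, _ = srv.accept()
            data = conn.recv(1024)
            if data == b"":          # peer sent FIN: this socket is now CLOSE_WAIT
                leaked.append(conn)  # hold a reference and never call conn.close()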


  • Clients can make a maximum of only 15 connections to a custom Ubuntu server

    - by gtan
    I have a custom server in C# run on Ubuntu 9 under Mono. I can make up to 15 Silverlight clients connect to the server. When I make the 16th, it just waits; and if I close one of the established connections, the 16th client is able to connect. I am making the connections from one machine. I am also not exceeding any file handle limit: the limit is 1024 and I am using around 300. Any ideas how to make more connections? Also, why the number 15? Is it something Linux-specific? EDIT: I have run the same server on an Ubuntu 11.10 virtual machine and was able to make more than 15 connections. I presume it's a configuration problem on the Ubuntu 9 machine then. Any help on that?


  • Maximum number of outgoing VPN connections?

    - by icray
    I'm working on a project where we would like to have around 50 outgoing (not incoming) VPN connections. We could use PPTP or IPsec. Security isn't an issue; we control all the sites, and users won't have access. Windows Server appears to have a limit of 5 outgoing connections. Does anyone know if it's possible to tweak it to allow more? Or can we get this many outgoing VPN connections with Linux?

