Search Results

Search found 41135 results on 1646 pages for 'non relational database'.

Page 545 of 1646

  • rewrite redirect issue in debian squeeze

    - by hd01
    My server OS is Debian squeeze. I have these lines to redirect non-www to www in the .htaccess file of my website: RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC] RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301] but they cause this error in Firefox: "The page isn't redirecting properly. Firefox has detected that the server is redirecting the request for this address in a way that will never complete. This problem can sometimes be caused by disabling or refusing to accept cookies." When I comment out those lines in the .htaccess file, my site appears, but in non-www form. I'm sure it worked fine before on Ubuntu, but I don't know why it doesn't work now. Could you help me?
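
    A minimal configuration sketch to test against, assuming mod_rewrite is enabled on the new Debian box (a2enmod rewrite) and the vhost allows overrides (AllowOverride FileInfo or All); the two rewrite directives themselves are the ones from the question:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            RewriteCond %{HTTP_HOST} !^www\.example\.com$ [NC]
            RewriteRule ^(.*)$ http://www.example.com/$1 [L,R=301]
        </IfModule>

    If the loop persists with exactly this in place, something in front of or inside Apache (a proxy, or another Redirect/RewriteRule in the vhost) is probably sending www.example.com back to the bare domain, so the two redirects chase each other.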

    Read the article

  • Do I need to have a company so that I can buy an SSL certificate that will display green at the address bar?

    - by André Pena
    I have a non-commercial website in which users store some sensitive information, so I feel the need for an SSL certificate, but it seems that if I don't have a registered company I can't buy a green certificate. I have some related questions: Is it true that if I don't have a company, I can't have a green certificate? If I get a standard (non-business) certificate that won't go green (from GoDaddy, for instance), will it show up red? Or will it have a less ugly display, something more neutral that won't scare the user?

    Read the article

  • Is it faster to create indexes before or after data loading in MySQL?

    - by Josh Glover
    I have a data replication process that drops and recreates a few tables in a target database, then loads them up with data from a source database (running on another host, but that is immaterial to the question at hand). The target database does need primary keys and a few other indexes on its tables, but not during the data loading. I'm currently loading all of the data, then creating the indexes. However, index creation takes a pretty long time--30 minutes of my data loader's 5 and a half hour running time. My intuition tells me that creating the indexes at the end should be faster than creating them first, since the index would need to be rewritten with each insert. Can anyone tell me for sure which way is faster? FWIW, I'm running MySQL 5.1 with InnoDB tables.
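
    A hedged sketch of the load-then-index pattern, with hypothetical table and column names; the usual rule for InnoDB is to keep the clustered PRIMARY KEY in the table definition before the load and add only the secondary indexes afterwards, in a single ALTER so each one is built once:

        SET unique_checks = 0;
        SET foreign_key_checks = 0;

        CREATE TABLE target_table (
          id   INT NOT NULL,
          name VARCHAR(100),
          PRIMARY KEY (id)              -- InnoDB clusters rows on this, so keep it during the load
        ) ENGINE=InnoDB;

        LOAD DATA INFILE '/tmp/target_table.csv' INTO TABLE target_table;

        ALTER TABLE target_table
          ADD INDEX idx_name (name);    -- secondary indexes are built once, after the data is in

        SET unique_checks = 1;
        SET foreign_key_checks = 1;

    Depending on whether the InnoDB plugin (fast index creation) is in use on 5.1, the ALTER may still copy the table, but either way the index is built once instead of being maintained on every insert, which is usually the faster of the two orders.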

    Read the article

  • Sql Server 2008 Create Foreign Key Manually

    - by tgriffiths
    I have inherited an old database which wasn't designed very well. It is a Sql Server 2008 database which is missing quite a lot of Foreign Key relationships. Below shows two of the tables, and I am trying to manually create a FK relationship between dbo.app_status.status_id and dbo.app_additional_info.application_id I am using SQL Server Management Studio when trying to create the relationship using the query below USE myDatabase; GO ALTER TABLE dbo.app_additional_info ADD CONSTRAINT FK_AddInfo_AppStatus FOREIGN KEY (application_id) REFERENCES dbo.app_status (status_id) ON DELETE CASCADE ON UPDATE CASCADE ; GO However, I receive this error when I run the query The ALTER TABLE statement conflicted with the FOREIGN KEY constraint "FK_AddInfo_AppStatus". The conflict occurred in database "myDatabase", table "dbo.app_status", column 'status_id'. I am wondering if the query is failing because each table already contains approximately 130,000 records? Please help. Thanks.
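
    That error usually means some existing rows in dbo.app_additional_info have application_id values with no matching status_id, so the constraint cannot be validated; the row count itself is not the problem. A hedged sketch of how one might check, reusing the table and column names from the question:

        -- list the orphaned values that block the constraint
        SELECT a.application_id
        FROM dbo.app_additional_info AS a
        LEFT JOIN dbo.app_status AS s
            ON s.status_id = a.application_id
        WHERE s.status_id IS NULL;

        -- if the old data cannot be cleaned up yet, the constraint can be created
        -- without validating existing rows (new rows are still checked)
        ALTER TABLE dbo.app_additional_info WITH NOCHECK
            ADD CONSTRAINT FK_AddInfo_AppStatus
            FOREIGN KEY (application_id) REFERENCES dbo.app_status (status_id)
            ON DELETE CASCADE ON UPDATE CASCADE;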

    Read the article

  • Web Server Users - Best Practice

    - by Toby
    I was wondering what is considered best practice when several developers/administrators require access to the same web server. Should there be one non-root user, with a secure username and password unique to the web server, which everyone logs in as, or should there be a username for each person? I am leaning towards a username for each person to aid in logging etc., but then does the same user keep the same credentials across several servers, or should at least their password change depending on the server they are on? Should any non-root user of the system be added to the sudoers file, or is it best practice to leave everyone off it and only let root perform certain tasks? Any help would be greatly appreciated.
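
    For what it's worth, a hedged sketch of the per-person approach with limited sudo, using hypothetical group and command names (a drop-in file under /etc/sudoers.d, edited with visudo -f):

        # /etc/sudoers.d/webadmins  (hypothetical; adjust group and commands)
        # Each admin has a personal account and belongs to the webadmins group;
        # the group gets only the commands it actually needs, not a full root shell.
        %webadmins ALL=(root) /usr/sbin/service apache2 *, /usr/bin/tail /var/log/apache2/*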

    Read the article

  • Looking for WYSIWYG tool to create and edit HTML5 based presentations (slides)

    - by peterp
    There are a lot of different implementations of HTML5-based slide presentations out there, like Google Slides or S5. But all that I have seen so far seem to need a person who is able (and willing) to read and write HTML code. My company still uses PowerPoint, but some people are quite unhappy about its limitedness, e.g. the lack of possibilities to embed animation (other than just appear/disappear) without using Flash. I'd love to suggest a state-of-the-art solution based on HTML5, but I can't even consider suggesting a solution where the project people need a techie to add or edit the content of a slide. I am not looking for an editor for non-techies to create complex HTML5/JavaScript-based animations; of course, those should be done by a developer... basically, non-techies should be capable of doing the stuff they are doing in PowerPoint now. Thanks in advance for your suggestions, Peter

    Read the article

  • I am trying to rewrite a few links with htaccess

    - by Thorpe Obazee
    I have a few URLs and I need them rewritten; these:
        http://domain.net/blog/posts
        http://domain.net/blog/posts/index
        http://domain.net/blog/posts/view/uri/non-working-holiday
        http://domain.net/blog/posts/view/uri/we-no-longer-offer
        http://domain.net/blog/posts/view/uri/festivals
        http://domain.net/blog/posts/view/uri/christmas-is-just-around-the-corner
    should end up at the ones below:
        http://domain.net/posts/
        http://domain.net/posts/index
        http://domain.net/posts/view/uri/non-working-holiday
        http://domain.net/posts/view/uri/we-no-longer-offer
        http://domain.net/posts/view/uri/festivals
        http://domain.net/posts/view/uri/christmas-is-just-around-the-corner
    I was hoping that my .htaccess would fix this, but it doesn't:
        Options +FollowSymLinks
        IndexIgnore */*
        RewriteEngine on
        RewriteRule ^blog\/(.*)$ posts\/$1 [NC]
        # if a directory or a file exists, use it directly
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # otherwise forward it to index.php
        RewriteRule . index.php
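
    A hedged sketch of one variant worth trying, on the assumption that the application is a front controller that routes on the request URL (so a purely internal rewrite of /blog/... never reaches it as /posts/...); redirecting the /blog URLs externally sidesteps that, and the [L] flags stop each request from falling through to the next rule:

        Options +FollowSymLinks
        IndexIgnore */*
        RewriteEngine On

        # send /blog/<anything> to /posts/<anything> so the application sees the /posts URL
        RewriteRule ^blog/(.*)$ /posts/$1 [R=301,NC,L]

        # anything that is not a real file or directory goes to the front controller
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . index.php [L]

    Note that RewriteCond lines only apply to the very next RewriteRule, so the file/directory checks belong with the index.php rule, not the blog rule.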

    Read the article

  • Monitor resolution messed up somehow

    - by Kelp
    I purchased the Westinghouse 22" LCD LCM-22w3 a few years ago, and now it's been acting up on me. I just booted into Windows 7 (without changing any settings), and the default resolution is 1600x1024, and it allows me to select a refresh rate of up to 85 Hz (it didn't let me do that before). I usually have my resolution set to 1680x1050 with a refresh rate of 60 Hz. Now, that resolution does not even appear in the list. Does anyone have any idea of what could be the problem and how to fix it? Edit: I am not sure if this will help, but when I go to change the screen resolution, the monitor is listed as "Generic Non-PnP Monitor". It used to be referred to as "Generic PnP Monitor". I tried to disable the Generic Non-PnP Monitor, but when I restart, it uses that monitor again. Edit 2: I created a custom .inf file using PowerStrip, but that does not work either. The monitor settings are being stubborn.

    Read the article

  • How to solve virtual host issue

    - by Webnet
    I have multiple sites, all set up the same as below, except "bk" has something else in its place...
        NameVirtualHost *:80
        <VirtualHost bk:80>
            ServerName bk
            DocumentRoot /var/www/bk.com/
        </VirtualHost>
    and I get these errors when restarting Apache:
        [Mon Jan 17 10:28:56 2011] [error] VirtualHost bk:80 -- mixing * ports and non-* ports with a NameVirtualHost address is not supported, proceeding with undefined results
        [Mon Jan 17 10:28:56 2011] [warn] NameVirtualHost bk:80 has no VirtualHosts
    I don't get it... the other 2 sites I have virtual host configurations for, set up this exact same way, don't throw any errors. Update: one error message fixed - here's where I'm at now:
        <VirtualHost bk:80>
            ServerName bk
            DocumentRoot /var/www/bk.com/
        </VirtualHost>
        [Mon Jan 17 10:28:56 2011] [error] VirtualHost bk:80 -- mixing * ports and non-* ports with a NameVirtualHost address is not supported, proceeding with undefined results
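
    A hedged sketch of the usual fix: the "mixing * ports and non-* ports" error appears when the NameVirtualHost address (here *:80) doesn't match the address in the <VirtualHost> opening tag (here bk:80), so using the same wildcard in both places, for every site, makes it go away (ServerName and DocumentRoot taken from the question):

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName bk
            DocumentRoot /var/www/bk.com/
        </VirtualHost>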

    Read the article

  • How can I view a PDF in Firefox when the server specifies the wrong content type?

    - by Sam
    I am using Mozilla Firefox with a PDF viewer plug-in. The plug-in has been correctly associated with Adobe Reader files to view them in the browser in the settings. I would like to be able to view PDF files in Firefox rather than downloading them. This already works correctly when a web server indicates that a file has the Content-Type of application/pdf. However, some web servers provide other Content-Types for PDFs, such as application/octet-stream. (See this example of a PDF served with a non-pdf Content-Type.) I have looked at Firefox's MimeTypes.rdf file, and it appears to only support mapping applications based on file types for non-Internet-based files. How can I have Firefox view all PDF documents in-browser rather than only the ones with the application/pdf Content-Type?

    Read the article

  • debian dependencies (libssl-dev and libncurses5)

    - by RubyFreak
    I'm trying to install Ruby Enterprise Edition (REE) under RVM on Debian. My Debian is squeeze; uname -r gives 2.6.18-194.26.1.el5.028stab070.14xen. I tried to install REE, but it complains that it is missing libssl-dev and libreadline5-dev. I did upgrade lenny to squeeze, but I didn't update the kernel, since it's a production server. The operating system is already updated and upgraded. sources.list:
        deb http://ftp.de.debian.org/debian/ squeeze main contrib non-free
        deb-src http://ftp.de.debian.org/debian/ squeeze main contrib non-free
        deb http://security.debian.org/ squeeze/updates main
        deb-src http://security.debian.org/ squeeze/updates main
    I tried to install the packages with the following command:
        apt-get install libssl-dev libreadline5-dev
    But unfortunately I'm getting the following problems:
        The following packages have unmet dependencies:
          libreadline5-dev: Depends: libncurses5-dev but it is not going to be installed
          libssl-dev: Depends: libssl0.9.8 (= 0.9.8o-4squeeze1) but 0.9.8o-6 is to be installed
        E: Broken packages
    I was thinking of reinstalling those packages, but they have too many dependencies, and it is a production server, so I would like to know whether there is any other way to fix it, or at least to double-check whether reinstalling both is really necessary. :-/
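
    A hedged sketch of how one might track the mismatch down (package names and version numbers are the ones from the error; the commands are standard apt tooling): the complaint that libssl-dev wants libssl0.9.8 0.9.8o-4squeeze1 while 0.9.8o-6 "is to be installed" usually means libssl0.9.8 is being offered from a different suite than the squeeze security repository, so checking where each candidate comes from, and pinning the matching version if necessary, is a low-risk first step on a production box:

        apt-get update
        apt-cache policy libssl0.9.8 libssl-dev libncurses5-dev libreadline5-dev
        # if libssl0.9.8's candidate really is 0.9.8o-6 from a stray source, either remove
        # that source from sources.list or install the matching squeeze version explicitly:
        apt-get install libssl0.9.8=0.9.8o-4squeeze1 libssl-dev libreadline5-dev libncurses5-dev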

    Read the article

  • What is the largest flatscreen monitor available for PC use?

    - by Avery Payne
    I'll qualify this specifically (by order of preference):
        - must have the highest diagonal measurement; widescreen or "normal" aspect ratio doesn't matter here, just the diagonal.
        - must have the highest resolution available, which means 72 inches of 1280x1024 won't cut it.
        - must not have a TV tuner built into it; I'm not looking for a TV set, this is a monitor!
        - must be available at a retail outlet that caters to the general public, i.e. Best Buy, Sears, Costco (all of these examples are in the U.S., although you can suggest something from whatever chain is in your area/nation/geography). Non-retail or non-physical venues like eBay, or businesses that only cater to other businesses, do not qualify under this requirement. I should be able to walk into this place and purchase it, not just whip up an order online. If you are unsure about this requirement, just ask yourself: can I physically see it before I open my wallet and purchase it?

    Read the article

  • Best practice for ONLY allowing MySQL access to a server?

    - by Calvin Froedge
    Here's the use case: I have a SaaS system that was built (dev environment) on a single box. I've moved everything to a cloud environment running Ubuntu 10.10. One server runs the application, the other runs the database. The basic idea is that the server that runs the database should only be accessible by the application and the administrator's machine, who both have correct RSA keys. My question: Would it be better practice to use a firewall to block access to ALL ports except MySQL, or skip firewall / iptables and just disable all other services / ports completely? Furthermore, should I run MySQL on a non-standard port? This database will hold quite sensitive information and I want to make sure I'm doing everything possible to properly safeguard it. Thanks in advance. I've been reading here for a while but this is the first question that I've asked. I'll try to answer some as well = )
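
    A hedged sketch of one common arrangement on Ubuntu, with hypothetical addresses: bind MySQL only to the private interface and let a default-deny firewall admit the application server and the admin machine and nothing else; once that is in place, moving MySQL off 3306 adds little beyond obscurity.

        # /etc/mysql/my.cnf -- listen on the private interface only, not 0.0.0.0
        #   bind-address = 10.0.0.3        (hypothetical private IP of the database server)

        # ufw: default deny, then open SSH for the admin box and MySQL for the app server
        ufw default deny incoming
        ufw allow from 203.0.113.10 to any port 22 proto tcp     # administrator's machine
        ufw allow from 10.0.0.5 to any port 3306 proto tcp       # application server
        ufw enable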

    Read the article

  • Running telnet standalone - possible?

    - by Lanz
    So, this is what I want to do: there is a local non-superuser account, and it can upload files into /tmp. Using this account, I download a telnet server package equivalent to what is already installed. I modify some settings, pointing all file directories into /tmp. Then I compile it and run it as a standalone telnet server. Is this possible? If not, what makes it impossible? Or, as a non-privileged user, would there be any other way to enable telnet?
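
    A hedged sketch of the sort of thing that can work, assuming a BusyBox binary can be uploaded to or built under /tmp (the main constraint is that a non-root user cannot bind ports below 1024 or start real login sessions, so the daemon has to sit on a high port and hand out a plain shell):

        # run a telnet daemon in the foreground on an unprivileged port
        /tmp/busybox telnetd -F -p 2323 -l /bin/sh
        # connect from elsewhere with:
        #   telnet server-host 2323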

    Read the article

  • What constitutes valid justification for more IP addresses?

    - by David
    I host a small website with a well known VPS service. They provided me with one IPv4 address upon registering and said additional addresses would require justification. I requested one additional IPv4 address so as to have one for a production environment and one for a testing/QA environment. They said this was unnecessary as I could just use alternative TCP ports for the test environment. I can live with using a non-standard port for non-production hosting, but it got me thinking, what would be valid justification? (I asked them and they didn't want to answer). Is there an industry standard for what counts as "valid" justification for additional IPv4 addresses?

    Read the article

  • Linux Best Practices

    - by Zac
    I'm a life-long Windows developer switching over to Linux for the first time, and I'm starting off with Ubuntu to ease the learning curve. My new laptop will primarily be a development machine: 6GB RAM, 320 GB HD. I'd like there to be 2 non-root users: (a) Development, which will always be me, and (b) Guest, for anyone else. I assume the root user is added by default, like System Administrator in Windows.
    (1) I'd like to mount /home on its own partition, but how does this work if I have two user accounts (Development and Guest)? Are there 2 separate /home directories, or do they get shared? Is it possible to allocate more space for Development and only a tiny bit of space for Guest in GRUB2? How?!?!
    (2) I'm assuming that it's okay that all of my development tools (Eclipse & plugins, SVN, JUnit, ant, etc.) and Java will end up getting installed in non-/home directories such as /usr and /opt, but that my Eclipse/SVN workspace will live under my /home directory on a separate partition... any problems, issues, concerns with that?
    (3) As far as partitioning schemes go, nothing too complicated, but not plain Jane either:
        - Boot partition, 512 MB, in case I want to install other OSes
        - Ubuntu & non-/home file system, 187.5 GB
        - Swap partition, 12 GB = RAM x 2
        - /home partition, 120 GB
    I don't have any bulky media data (I don't have music or video libraries; this is a lean and mean dev machine), so having 320 GB is like winning the lottery and not knowing what to do with all this space. I figured I'd give a little extra space to the OS/FS partition since I'll be running JEE containers locally and doing a lot of file IO, logging and other memory-intensive operations. Any issues, problems, concerns, suggestions?
    (4) I was thinking about using ext4; it seems to have good filestamping without any space ceiling for me to hit. Any other suggestions for a dev machine?
    (5) I read somewhere that you need to be careful when you install software as the root user, but I can't remember why. What general caveats do I need to be aware of when doing things (installing packages, making system configurations, etc.) as root vs the "Development" user? Thanks!
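
    On question (1), a single /home partition normally holds one directory per user rather than a partition per user, and per-user space is a quota question rather than a GRUB2 one. A hedged sketch, with a hypothetical device name:

        # /etc/fstab -- one partition mounted at /home, with user quotas enabled
        /dev/sda4  /home  ext4  defaults,usrquota  0  2

        # /home/development and /home/guest both live on this partition;
        # Guest's share can then be capped with a quota, e.g.:
        #   quotacheck -cum /home && quotaon /home
        #   edquota -u guest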

    Read the article

  • Samba users not added until they log on first? Edit: How do I add users to tdbsam without a password prompt?

    - by glisignoli
    I add users to my server with the command useradd -m -p PASS_HASH -s /usr/sbin/nologin USERNAME Then I try to access their Samba home share, but it never shows up until I log in as the user: root:~$ sudo login failtest Password:###### Added user failtest. Is there some way of adding the user without logging in? Edit: The problem is that the user is added with the useradd command, but Ubuntu seems to run an initialisation script when the user logs on for the first time. This script then adds that user to the tdbsam user database. I need to find that initialisation script, or the method it uses to add a user to the tdbsam database without requiring any user input (since smbpasswd -a USER prompts the user for a password). So all I need is a way to add a user+password to the tdbsam database without prompting for a password (e.g. samba-add-user.sh USERNAME PASSWORD).
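
    A hedged sketch of the non-interactive route (the wrapper script name is the hypothetical one from the question): smbpasswd can take the password on stdin with -s, which is usually enough to skip the prompt entirely.

        #!/bin/sh
        # samba-add-user.sh USERNAME PASSWORD  (hypothetical wrapper)
        USERNAME="$1"
        PASSWORD="$2"
        useradd -m -s /usr/sbin/nologin "$USERNAME"
        printf '%s\n%s\n' "$PASSWORD" "$PASSWORD" | smbpasswd -s -a "$USERNAME"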

    Read the article

  • Lotus Notes 8.5 quota

    - by Cividan
    We're using Lotus Notes 8.5, and I have a user who was over his quota, as he had sent 6 emails with attachments over 800 MB (no comment...). I deleted these oversized emails and emptied the trash, but Domino keeps sending email warnings about the quota. I checked in the All Documents view and they are no longer there, and I emptied the trash again. I saw a post on the internet saying to compact his database; when I go under File, Application, Properties and click on the Info tab, I see that he uses 35.7% of the 3 GB database. When I click on "compact" I see a message saying the compacting of the database is being processed... the message disappears after about 1 minute, but nothing else seems to happen, and when I look back later the space problem has not changed. Any advice would be appreciated.

    Read the article

  • Why is Server 2012 assigning "169.254.*.*" series when creating DHCP server?

    - by Seth
    I have a small office, with an AT&T Motorola modem (192.168.1.254) set as passthrough to a D-Link DIR-815 (LAN 192.168.0.1). I am trying to set up a DHCP server on Server 2012, and when I create the new DHCP server, the title is created as 169.254.*.* instead of the domain name. (Domain clients can retrieve IPs as defined in the scope.) Non-domain clients are not receiving IPs from the server but rather from the Motorola... How do I make sure the DHCP setup is creating itself properly, and how do I make sure domain and non-domain clients get IPs from the server?
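
    A hedged sketch of the order that usually avoids this on Server 2012, with hypothetical interface names and ranges (a 169.254.x.x label generally means the DHCP role bound to an APIPA address because the NIC had no static address when the role was configured):

        # give the server a static address first, then create and authorize the scope
        New-NetIPAddress -InterfaceAlias "Ethernet" -IPAddress 192.168.0.10 -PrefixLength 24 -DefaultGateway 192.168.0.1
        Add-DhcpServerv4Scope -Name "Office" -StartRange 192.168.0.100 -EndRange 192.168.0.200 -SubnetMask 255.255.255.0
        Add-DhcpServerInDC -DnsName dhcp.example.local -IPAddress 192.168.0.10
        # ...and disable the DHCP service on the Motorola/D-Link, so non-domain clients
        # stop taking leases from the router instead of the Windows server.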

    Read the article

  • How to elegantly selectively exclude FreeBSD network traffic from OpenVPN interface by port

    - by Polygonica
    Inexperienced sysadmin here. I'm planning on running a net daemon inside a FreeBSD jail through OpenVPN, but I want to be able to SSH directly into the jail and use the daemon's web interface without going through the VPN. As I understand it, an OpenVPN tunnel is normally set up as the default virtual internet interface, and so replies to incoming traffic will go out on the OpenVPN interface by default (which is problematic, as this incurs latency). I thought "well, obviously, since all of this traffic is leaving on a handful of ports, I'll just redirect those to the non-VPN gateway." I've tried to look for solutions, but almost all of them involve iptables instead of ipfw (which is the default for FreeBSD) and solve slightly different problems. And alternative solutions, like using multiple default routes to ensure that incoming traffic on any interface is always sent out on the same interface, seem far-reaching and require deep knowledge of all the tools involved. Is there an elegant way of ensuring that traffic leaving on specific ports exits on a specified non-default interface using ipfw?
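
    A hedged sketch of the ipfw-only approach, with hypothetical addresses and ports (it relies on ipfw's fwd action, which needs IPFIREWALL_FORWARD support in the kernel): packets leaving from the SSH and web-interface ports of the jail's address are pushed to the LAN gateway instead of following the OpenVPN default route.

        # 192.168.1.50 = the jail's address       (hypothetical)
        # 192.168.1.1  = the regular LAN gateway  (hypothetical)
        # 22, 8080     = SSH and the daemon's web interface
        ipfw add 100 fwd 192.168.1.1 tcp from 192.168.1.50 22,8080 to any out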

    Read the article

  • MySQL gzipped Export in PhpMyAdmin has wrong size in Mozilla

    - by Michal Gow
    That is really strange. I am using PhpMyAdmin 2.11.9.6 on Linux hosting. When I export databases using "gzipped" compression in Mozilla Firefox, I get files that have the size of the uncompressed database, but they seem to download at an incredible speed (10 times quicker than is possible with my ISP). So at the end, for a database of 10M size: I get a 10M gzip downloaded in milliseconds; it indeed shows 10M size on the drive; and it is corrupted. Zip compression works just fine (I get a file of about 1M size with the correct content of the compressed database). And the weirdest thing: this happens in Mozilla Firefox (13.0.1) only; Internet Explorer 9 downloads correct gzipped files... Any hint?

    Read the article

  • rsync to ONLY keep files in destination that have been removed from source

    - by David Corley
    We use rsync to copy filesystem contents from one machine to another as a backup. We first run a MACHINE-X to MACHINE-Y rsync for a straight backup, with the --delete and --delete-excluded switches. We also run an internal rsync between the MACHINE-Y destination and another folder on MACHINE-Y, without either of the delete flags. This maintains a non-destructive copy in the event someone inadvertently deletes a file on MACHINE-X. However, it also has the overhead of being a complete copy of what has already been synchronized. Ideally I want to be able to run the non-destructive rsync in such a way that the destination ONLY receives the deleted files and so avoids unnecessary duplication. Is there any way to do this?
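
    A hedged sketch of one way this is often done, with hypothetical paths: --backup together with --backup-dir on the first (destructive) pass makes rsync move anything it would delete or overwrite into a side directory, so the "deleted files only" copy falls out of the main run and the second full copy is no longer needed.

        rsync -a --delete --delete-excluded \
              --backup --backup-dir=/srv/backup/removed-$(date +%F) \
              machine-x:/data/ /srv/backup/current/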

    Read the article

  • Connecting SQL 2005 to Oracle 10g

    - by Lorn
    Environment: an Oracle 10g database on a Windows 32-bit 2003 server, and a SQL Server 2005 database on a Windows 32-bit 2003 server. I am trying to connect the above databases through heterogeneous services. I have updated the following files: TNSNames.ora, Listener.ora and hs.ora. When performing a test connection from SQL Developer, I get the following error - ORA-28500 - indicating that the login for the SA user is incorrect. I also tried another authenticated user that has rights to the database. I can successfully connect with SQL 2000. Has anyone experienced such a problem before?
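
    A hedged sketch of one thing that frequently trips this up (the link name, TNS alias and password here are hypothetical): through heterogeneous services the SQL Server credentials are passed on as typed, and SQL Server treats them case-sensitively, so quoting both the user and the password in the database link often clears an ORA-28500 "login failed" even though the same credentials work elsewhere.

        -- run in the Oracle 10g database; MSSQL_HS must match the HS entry in TNSNames.ora
        CREATE DATABASE LINK mssql_link
          CONNECT TO "sa" IDENTIFIED BY "YourPassword"
          USING 'MSSQL_HS';

        -- quick test
        SELECT COUNT(*) FROM "sysobjects"@mssql_link;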

    Read the article

  • Keep Uploaded Files in Sync Across Multiple Servers - LAMP

    - by Dfranc3373
    I have a website that is currently utilizing 2 servers, an application server and a database server; however, the load on the application server is increasing, so we are going to add a second application server. The problem I have is that the website has users upload files to the server. How do I get the uploaded files onto both of the servers? I do not want to store images directly in the database, as our application is database-intensive already. Is there a way to sync the servers with each other, or is there something else I can do? Any help would be appreciated. Thanks
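
    A hedged sketch of one common LAMP answer, with hypothetical host names and paths: keep a single canonical upload directory and share it, e.g. over NFS, so both application servers read and write the same files instead of trying to keep two copies in sync.

        # on the server that will own the uploads (exporting), in /etc/exports:
        /var/www/uploads  app2.example.com(rw,sync,no_subtree_check)
        # then reload the export table:
        #   exportfs -ra

        # on the second application server (mounting):
        mount -t nfs app1.example.com:/var/www/uploads /var/www/uploads
        # add the equivalent line to /etc/fstab so it survives a reboot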

    Read the article
