Search Results

Search found 24117 results on 965 pages for 'write through'.


  • Where can I learn about managing domain names for my websites? [closed]

    - by Shahbaz
    [I originally asked this question on serverfault.com, where it was closed as 'out of scope.' Hopefully it is appropriate for this forum.] I am a developer who doesn't understand how to effectively manage Internet domain names. Say I registered a name with Namecheap and host a website on Linode. Now what is an A record? What is a name server, and do I host it with Namecheap or Linode? Why would I pay Amazon when others are free? Does any of this matter in terms of website latency or reliability? I feel like a script kiddie, copying and pasting other people's configurations and hoping they work. Is there a book or other resource that explains all this? I know Amazon is full of books about DNS, but as far as I know they are about setting up DNS servers for local networks, not the Internet. P.S. To emphasize, I'm asking for books or long write-ups which explain this to technically competent people who just haven't had to think about the role of commercial registrars, name servers, commercial hosts and commercial websites, and how all the parts play together on the real Internet (not local networks).
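
    A minimal sketch of poking at these records from the command line, assuming dig is installed and using example.com as a stand-in for your domain:

      # Ask for the A record (the name-to-IP-address mapping) of a domain
      dig +short example.com A

      # Ask which name servers are authoritative for the domain
      dig +short example.com NS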

    Read the article

  • Why is my HDD coming back from standby?

    - by Pablo
    My hard drives, connected to an Ubuntu server, produce the following log entries exactly every 5 minutes: Nov 1 14:10:50 localhost kernel: [ 1602.884936] ata2.00: hard resetting link Nov 1 14:10:51 localhost kernel: [ 1603.226804] ata2.01: hard resetting link Nov 1 14:10:52 localhost kernel: [ 1604.274533] ata2.00: SATA link up 3.0 Gbps (SStatus 123 SControl 300) Nov 1 14:10:52 localhost kernel: [ 1604.274548] ata2.01: SATA link up 3.0 Gbps (SStatus 123 SControl 300) Nov 1 14:10:52 localhost kernel: [ 1604.356669] ata2.00: configured for UDMA/133 Nov 1 14:10:52 localhost kernel: [ 1604.375247] ata2.01: configured for UDMA/133 Nov 1 14:10:52 localhost kernel: [ 1604.375265] ata2: EH complete I don't think this is related to hard drive failure, because it happens for ALL hard drives connected and ONLY when I set spindown_time = 12 in /etc/hdparm.conf. I added that value to put the disks into standby mode after 60 seconds, and they do spin down after that period (checked with hdparm -C). My first thought was that smartd was running and spinning the drives up, but I couldn't find it in ps -aux | grep smart. Additionally, iostat shows that nobody accessed those drives, since Blk_read and Blk_wrtn remain unchanged. I also killed all processes that might be doing something with the HDDs (e.g. Samba). So I guess the problem lies solely with hdparm... I have no idea where that 5-minute interval comes from.
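
    One way to narrow this down is to log exactly which process touches the disks. A rough sketch using the kernel's block_dump switch (present on kernels of that era) and /dev/sdb as a stand-in device name:

      # Confirm the drive really reaches standby after the 60-second timeout
      sudo hdparm -C /dev/sdb

      # Temporarily log every block-device access to the kernel log
      echo 1 | sudo tee /proc/sys/vm/block_dump

      # Wait past the 5-minute mark, then look for READ/WRITE/dirtied lines naming a process
      dmesg | tail -n 50

      # Turn the logging off again when done
      echo 0 | sudo tee /proc/sys/vm/block_dump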

    Read the article

  • Best use of new express card on Windows

    - by jckdnk111
    I just bought a 48GB SSD express card for my laptop and I am trying to decide how best to use it. I will be running some sort of virtualization (probably VirtualBox) to test and learn Windows Server administration. I am running Windows 7 Ultimate 64-bit with 4GB of RAM and a 7200 RPM SATA hard disk. The express card reads at 115MB/s and writes at 65MB/s. So how best to use this new disk? ReadyBoost, relocate the pagefile, store VM disks, or some mix of these?

    Read the article

  • Apache mod_remoteip and access logs

    - by GioMac
    Since Apache 2.4 I've been using mod_remoteip instead of mod_extract_forwarded to rewrite the client address from the X-Forwarded-For header provided by frontend servers (Varnish, Squid, Apache etc.). So far everything works fine with the modules, i.e. PHP, CGI, WSGI etc. - client addresses are shown as they should be - but I can't get the client address into the access logs (%a, %h, %{c}a). No luck - I always get 127.0.0.1 (the localhost forward, for example). How do I log the client's IP address when using mod_remoteip?
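
    For comparison, a minimal mod_remoteip sketch; the header name and proxy address are assumptions that have to match the actual frontend, and %a only reflects the rewritten client address once RemoteIPHeader is configured (while %{c}a keeps the raw connection peer):

      # Which request header carries the real client address
      RemoteIPHeader X-Forwarded-For

      # Only trust that header when it arrives from the frontend proxy
      RemoteIPInternalProxy 127.0.0.1

      # %a is the (rewritten) client address, %{c}a the underlying connection peer
      LogFormat "%a %l %u %t \"%r\" %>s %b" remoteip_log
      CustomLog logs/access_log remoteip_log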

    Read the article

  • How can I make Bash (or Zsh) run a particular command before each entered command?

    - by Peeja
    I'd like to configure Bash to run a particular command before running each command line I enter at the prompt. Specifically, I'd like to tell Vim (which is running in another terminal) to write all open buffers, because in my workflow anything left unsaved when I leave Vim is a mistake. Is there an option for this in Bash? If not, is there an option in Zsh? (There is a readline-based solution that somewhat fits this problem on another question, but it feels a bit hacky. I'll take it as a last resort.)
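
    Both shells have a native hook for this. A rough sketch, assuming Vim was started with a client-server name (e.g. vim --servername VIM), which is an assumption about the setup rather than a default:

      # Zsh: preexec() runs just before every command line is executed
      preexec() {
        vim --servername VIM --remote-send '<Esc>:wall<CR>' 2>/dev/null
      }

      # Bash: the DEBUG trap fires before each command (including prompt commands)
      trap 'vim --servername VIM --remote-send "<Esc>:wall<CR>" 2>/dev/null' DEBUG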

    Read the article

  • netsh advfirewall firewall

    - by lehn0058
    I am trying to write a script to configure the Windows firewall (Server 2008 & 2012 only) to adjust certain firewall settings after a machine has been added to a domain. I need to do this because one of the programs pre-installed on the machines we get only has its firewall rules set up for the public and private profiles. This script will be pushed out for other admins to use, and some of the machines will be running other languages. The command to change an existing firewall rule is as follows: netsh advfirewall firewall set rule name = "rule name goes here" new profile=domain This command works great. However, I need to do this for about 10 firewall ports, AND since the program could be installed on computers with different languages, I cannot just pass the names of all the firewall rules. Is there some way to do this by supplying the port number? Or some way to specify a regular expression, so I could use any rule whose name is LIKE 'test'?
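
    netsh's set rule accepts matching criteria other than the (localized) rule name, so one hedged possibility is to select rules by protocol and port with name=all. The port list below is a placeholder, and the selector support should be verified with "netsh advfirewall firewall set rule /?" on the target OS before relying on it:

      rem Sketch: move every inbound TCP rule on the listed ports to the domain profile
      for %%P in (80 443 8080) do (
          netsh advfirewall firewall set rule name=all dir=in protocol=tcp localport=%%P new profile=domain
      )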

    Read the article

  • Excel: How do I copy hyperlink address from one column of text to another column with different text?

    - by OfficeLackey
    I have a spreadsheet where column A displays names in a certain format. There are 200-odd names and each has a different hyperlink (which links to that person's web page). I want to reformat the name order so it is "Surname, Name" rather than "Name Surname" and retain the hyperlink in the newly formatted column. I have achieved "Surname, Name" easily by splitting the names into two columns (using LEFT and RIGHT formulae) - forename and surname - then I have a new column with a formula to return "Surname, Name." However, the hyperlinks are not in that new column and I need them. I don't want to do this manually, for obvious reasons. I cannot find a way of copying just hyperlinks from column A without copying the text from column A. So, effectively, what I need is some sort of macro to take, for example, the hyperlink from A2 and copy it to H2, with H2 still retaining the updated ordering of name. I don't have the knowledge to write this myself, so would appreciate solutions.
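
    A rough VBA sketch of such a macro, assuming the hyperlinked names sit in column A starting at row 2 and the reordered names are already in column H (the columns and row range are placeholders to adjust):

      Sub CopyHyperlinks()
          Dim r As Long
          For r = 2 To 201                      ' adjust to the real number of rows
              If Cells(r, "A").Hyperlinks.Count > 0 Then
                  ' Attach the same link target to column H, keeping H's own display text
                  ActiveSheet.Hyperlinks.Add _
                      Anchor:=Cells(r, "H"), _
                      Address:=Cells(r, "A").Hyperlinks(1).Address, _
                      TextToDisplay:=Cells(r, "H").Text
              End If
          Next r
      End Sub

    Note that TextToDisplay replaces any formula in column H with its current text, so this is best run after the reordering formulas have been converted to values.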

    Read the article

  • Make download dialog show up for pictures in IIS 6.0

    - by LinuxGnut
    I have a client that is offering picture downloads from their site. They want the user to have to download the picture, rather than having it appear in the browser when the link is clicked. Inserting the text "Right-click to save as" or something similar isn't an option with this client, as everything needs to be done their way. Now, I know I could accomplish this task in PHP or ASP, but I'd rather not have to write an addition to Magento to accomplish this. So is there a way in IIS 6.0 (Server 2003) to send the correct headers for image formats so that a download dialog pops up?

    Read the article

  • Over 300 "NetBeans Platform for Beginners" Sold

    - by Geertjan
    I've noticed that the authors of "NetBeans Platform for Beginners" have started publishing the number of sales they have achieved: at the time of writing, the counter on their page shows 304 (which will probably change quite quickly). That's pretty good, since the book has only existed for a few months and developers tend to share books they buy in PDF format. It probably means there are 300 teams of software developers around the world using the book, which is pretty awesome. (Though it would help the authors significantly, I'm sure, if individual developers on teams would buy the book rather than sharing one between them. Come on, let's support these great authors so that they'll write more books like this.) Also note that there is a set of reviewer comments on the book's page. Plus, the book is updated at the end of each month, so it continues to grow and improve from month to month, for free, for everyone who has bought it. If you've read the book and want to contribute a review like those, contact walternyland @ yahoo dot com. Great work, guys! For anyone out there who hasn't got it yet: https://leanpub.com/nbp4beginners

    Read the article

  • How can I remount an NFS volume on Red Hat Linux?

    - by user76177
    I changed the user id of a user on an NFS client that mounts a volume from another server. My goal is to get the two users to have the same id, so that both servers can read and write to the volume. I changed the id successfully on the client system, but now when I look at the NFS mount from that system, the files are reported as being owned by the old id. So it looks like I need to "refresh" that mount. I have found many instructions on how to remount, but each seems slightly different depending on the type of system. Is there a simple command I can run to get the mounted volume to refresh so that it picks up the new user settings?
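
    A minimal sketch, with /mnt/data standing in for the real mount point; the simplest "refresh" is a clean unmount and mount, which also drops the client's cached file attributes:

      # Make sure nothing on the client is still using the mount point
      sudo fuser -vm /mnt/data

      # Unmount and mount again (the options from /etc/fstab are reused)
      sudo umount /mnt/data
      sudo mount /mnt/data

      # A remount in place sometimes suffices, but the full unmount is the more reliable of the two
      sudo mount -o remount /mnt/data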

    Read the article

  • How to repair a Veritas tape that has been overwritten a bit?

    - by Ismo Utriainen
    I meant to restore some files, but I forgot that there was a Monday backup job just waiting for a tape to be loaded. So Veritas 10d started to write over my tape, and that valuable data is now gone. The original data size was about 40 GB, and the accidentally started job wrote about 30 MB to the beginning of the tape. What are my options for recovering some data from that tape? Update: Inventory and catalog don't help; the media settings are overwrite, not append. It is a DLT drive.

    Read the article

  • SSD on VMware ESXi 4 (TRIM? Good idea?)

    - by nextgenneo
    Hi, I just posted about finding bottlenecks and have narrowed it down to having way too many VMs on my machine on one 15K SAS drive. I have plenty of cores and plenty of RAM, so I am planning on putting 6 VMs on one drive (so 5 drives for 30 VMs). I am thinking of using a 60GB Vertex 2 SSD. Each of my VMs only needs about 6GB of HDD space, so this isn't a big deal. My questions are: does ESXi support TRIM, and do I really need it if I leave 25% of the drive as free space? If I need it, should I get a different drive that handles garbage collection differently? I have a RAID controller with write caching - will I still benefit from that, and will it affect this setup differently? Is there anything else I need to consider regarding SSDs in virtualized environments? Thanks for any and all help!

    Read the article

  • Laptop, unable to install discrete graphics card GTX 880M

    - by FoxyShadoww
    I bought the GT70 2PE Dominator Pro a few weeks ago and installed Zorin OS 9 Ultimate on it. Today I tried to install the Nvidia drivers on my laptop, since it has the GTX 880M, but my system became unbootable. Can anyone help me with this issue? This is what I've tried so far: Downloaded the newest Nvidia drivers from their website. Pressed CTRL+ALT+F2 to open a virtual terminal. Logged in and got root access. Stopped the lightdm service. Ran the NVIDIA-Linux-x86_64-340.32.run installer. Pressed the accept button, and right after that it told me the following: The distribution-provided pre-install script failed! Continue installation anyway?. When I install anyway, it crashes my system and makes it unbootable. Does anyone know how to get my GTX 880M working? Do I need to enable it at boot time somehow? Thanks for the support, Sapphire ~
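
    That pre-install warning is usually followed by the installer clashing with the in-kernel nouveau driver. A hedged sketch of the common workaround on Ubuntu-based systems such as Zorin - blacklist nouveau, rebuild the initramfs, reboot, then rerun the installer outside X - offered as a starting point rather than a guaranteed fix:

      # Keep the nouveau driver from loading at boot
      echo "blacklist nouveau"         | sudo tee    /etc/modprobe.d/blacklist-nouveau.conf
      echo "options nouveau modeset=0" | sudo tee -a /etc/modprobe.d/blacklist-nouveau.conf

      # Rebuild the initramfs so the blacklist takes effect early, then reboot
      sudo update-initramfs -u
      sudo reboot

      # After the reboot, stop the display manager and rerun the installer
      sudo service lightdm stop
      sudo sh NVIDIA-Linux-x86_64-340.32.run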

    Read the article

  • Advanced Terminal / Console apps for Mac OS X?

    - by Jakob Egger
    I use a lot of command line programs, very often with similar arguments. Can anyone recommend an application or a workflow that allows me to store often-used shell commands and search through my recent commands, using a GUI? I have commands that I use very often (e.g. rsync a specific directory to a server) and other commands that I use less often. Creating shell scripts for every code snippet I might reuse seems a bit awkward. Especially for programs that I use rarely, I end up reading the docs over and over again, because I forgot to write down the exact shell command. Ideally I would like an app that's just like Terminal.app, but provides some kind of history and snippet management. What do you use to keep track of shell commands?
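
    On the workflow side, one lightweight option (a sketch of a habit, not a specific app) is a plain-text snippet file searched with a tiny helper, combined with the shell's own Ctrl-R incremental history search; the server name and paths below are placeholders:

      # Keep often-used commands, one per line with a trailing comment, in a plain file
      echo "rsync -avz ~/project/ user@server:/var/www/project/   # deploy project" >> ~/.snippets

      # Grep that file whenever the exact invocation is forgotten
      snip() { grep -i -- "$1" ~/.snippets; }
      snip rsync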

    Read the article

  • Windows Network copy and access denied randomly

    - by The King
    I have a Windows 2008 R2 server into which I have just installed a new, bigger HDD. I wanted to copy big AVI files to the new server HDD, which is shared on the local network. I have write access to the server's HDD and can successfully copy smaller files to it. But when I copy bigger files (more than 500MB), I randomly get an Access Denied message during the copy. If I use RDP, I can copy the files through the RDP client. I checked the error messages on the server but didn't find any error about this access denial. Because the copy works over RDP, I don't think this could be a hardware error; I think it is some kind of software setting error. Has anyone faced this kind of error, or does anybody have an idea what could cause it or how to find the root of the problem?

    Read the article

  • Synchronizing 3 servers over IP

    - by user93078
    I'm setting up a medical server for a hospital that has doctors located in 3 different locations, meaning there would be 3 servers (1 in each location). All 3 servers would just have the following software: Ubuntu Server 12.04 minimal; MySQL, PHP 5, Apache; the medical software, which reads/writes to the MySQL database; remote admin apps like Nagios & Webmin; rsync for backup (rsync-over-ssh) as a cron job. The doctors at each location would access patient & billing data from their respective servers. What I'd like is for each of these servers to have synchronized data (especially the MySQL databases) - let's say each server synchronizes data to a common remote server on an hourly basis, and the data is then brought down to each of the servers. I know an easier way would be to have the medical app running on a remote web server, but since this is medical data that we're talking about, and knowing how common it is in our area for the net to go down, I wouldn't like a web-based scenario. Is such a setup possible? Would this be the right way to do things, or is there a better way? I would really appreciate views and comments (or how to set this up).
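
    For the file side, an hourly rsync-over-ssh cron job is straightforward; for MySQL it is safer to ship a dump (or use real MySQL replication) than to rsync live data files, and rsync alone gives only one-way copying rather than true multi-master synchronization. A rough sketch, where central.example.org, the credentials and the paths are all placeholders:

      #!/bin/sh
      # Placed in /etc/cron.hourly/sync-to-central on each location's server (must be executable)

      # Dump the database to a file that is safe to copy while MySQL is running
      mysqldump --single-transaction -u backup -pSECRET clinicdb > /var/backups/clinicdb.sql

      # Push the dump and the application files to the central server over ssh
      rsync -az -e ssh /var/backups/clinicdb.sql /var/www/medical/ \
            backup@central.example.org:/srv/sync/site1/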

    Read the article

  • NFS on NAS server blocks in cluster environment

    - by Zardoz
    In our department we have an Iomega NAS (px4-300d) connected to a Supermicro cluster with 5 nodes (12 cores per node). Each node mounts a share on that NAS using NFS. Unfortunately, after some time (several minutes) of sustained read/write operations from all nodes, the NAS starts to block and a bit later freezes completely. We have tried several options of the mount command (async, intr, wsize, rsize), but nothing helped. The NAS itself doesn't allow many options (better to say none). Do you have any recommendations on how to integrate a NAS using NFS in a cluster environment?

    Read the article

  • ffmpeg error while segmenting

    - by Tommy Ng
    I'm using ffmpeg and segmenter on Ubuntu 10.04 to create a transport stream from FLV/H.264 video files and then segment the .ts file for iPad streaming. Some ts files show an error with segmenter - Output #0, mpegts, to '29': Stream #0.0: Video: 0x0000, yuv420p, 480x360, q=2-31, 90k tbn, 25 tbc Stream #0.1: Audio: 0x0000, 0 channels, s16 [mpegts @ 0x11f4ac0]sample rate not set Could not write mpegts header to first output file My ffmpeg command for creating the ts file: ffmpeg -i 1.flv -f mpegts -acodec libfaac -ar 48000 -ab 64k -s 480x360 -vcodec libx264 -b 192k -flags +loop -cmp +chroma -partitions +parti4x4+partp8x8+partb8x8 -subq 5 -trellis 1 -refs 1 -coder 0 -me_range 16 -keyint_min 25 -sc_threshold 40 -i_qfactor 0.71 -bt 200k -maxrate 192k -bufsize 192k -rc_eq 'blurCplx^(1-qComp)' -qcomp 0.6 -qmin 10 -qmax 51 -qdiff 4 -level 30 -aspect 480:360 -g 30 -async 2 -y 1.ts My segmenter command: segmenter 1.ts 10 1 1.m3u8 path/to/streams/

    Read the article

  • Disable my nv video card driver in Linux

    - by Dahaka Wang
    I'm trying to pass my nv video card through to my domU, but I could not bind it to the pciback driver. I only have one video card, with the PCI number 0000:03:00.0, so I used the following command: echo -n "0000:03:00.0" > /sys/bus/pci/drivers/nouveau/bind to unbind the nouveau driver from my video card. The screen went black because I had forcefully removed the video driver, so I ssh'd into the computer to run further commands. I ran: echo -n "0000:03:00.0" > /sys/bus/pci/drivers/pciback/bind to try to bind it to the pciback driver, but I got: bash: echo: write error: No such device I found out that this is the message shown when trying to bind a PCI device which is already bound. Therefore, I think something was still using my video card. Can anyone help me out? Thanks a lot!
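
    A hedged sketch of the usual sequence for xen-pciback (the "No such device" error often just means pciback has not been told about the slot yet); depending on the kernel, the driver directory may be named pciback or xen-pciback:

      # Load the pciback module if it is not built into the kernel
      sudo modprobe xen-pciback

      # Release the card from nouveau
      echo 0000:03:00.0 | sudo tee /sys/bus/pci/drivers/nouveau/unbind

      # Tell pciback about the slot, then bind it
      echo 0000:03:00.0 | sudo tee /sys/bus/pci/drivers/pciback/new_slot
      echo 0000:03:00.0 | sudo tee /sys/bus/pci/drivers/pciback/bind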

    Read the article

  • Why does IIS refuse to serve ASP.NET content?

    - by Michael Haren
    My Windows Server 2003 Std server refuses to serve ASP.NET content. It serves regular HTML just fine, but anything .NET - even a one-line HTML file with an ASPX extension - fails silently. Things I've tried: Nothing in the event log or IIS WWW logs when it fails. Fiddler shows no response. I reinstalled .NET with C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -U and C:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\aspnet_regiis.exe -I. I gave obscenely high permissions on everything I can think of (full control, read, write, etc.) to all possibly relevant users (IUSR*, ASP.NET, etc.). I confirmed that the ASP.NET v1 and v2 Web Service Extensions are "Allowed" in IIS. Confirmed that Server Manager had the IIS and ASP.NET roles enabled. Again, this is the scenario: http://localhost/Test/Default.htm <-- Works great! http://localhost/Test/Default.aspx <-- Bombs silently with no message at all Any guidance will be much appreciated! Solution: I reinstalled per the instructions below and it works now. Thanks all!

    Read the article

  • How to force ADF to speak your language (or any common language)

    - by Blueberry Coder
    When I started working for Oracle, one of the first tasks I was given was to contribute some content to a great ADF course Frank and Chris are building. Among other things, they asked me to work on a module about Internationalization. While doing research work, I unearthed a little gem I had overlooked all those years. JDeveloper, as you may know, speaks your language - as long as your language is English, that is. Oracle ADF, on the other hand, is a citizen of the world. It is available in more than 25 different languages. But while this is a wonderful feature for end users, it is rather cumbersome for developers. Why is that? Have you ever tried to search the OTN forums for a solution with a non-English error message as your query? I have, once. But how can you force ADF to use English for its logging operations? Playing with your system settings will not help, unfortunately. By default, ADF will output its error messages in the selected locale for the operating system account the application server runs on. The only way to change this behavior is to pass initialization parameters to the JVM used by the application server. It is even possible to specify the language and country/region separately. In the example below, we choose English and the United States respectively. -Duser.language=en -Duser.country=US In the case of WebLogic Server, it is possible to add such parameters in setDomainEnv.sh (or .cmd) to apply the settings to all the managed servers present on a node. In the coming weeks, I will write a few posts about other internationalization issues. Is there anything you would like me to cover? Let me know in the comments.
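
    In setDomainEnv.sh that usually means appending the flags to the JVM arguments used by every managed server on the node; a small sketch (the exact variable to extend can differ between WebLogic versions, so treat it as illustrative):

      # Appended near the end of setDomainEnv.sh so all managed servers log in US English
      JAVA_OPTIONS="${JAVA_OPTIONS} -Duser.language=en -Duser.country=US"
      export JAVA_OPTIONS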

    Read the article

  • How to start Sharepoint Development - Resources wanted [closed]

    - by user1249641
    I'm an apprentice for software development here in good ol' Germany, and I've been doing fairly well with C#, ASP.NET MVC, Entity Framework and LINQ so far. My superiors want me to focus solely on our intranet development and SharePoint development. They don't provide me with any resources to start: no books, no co-workers with actual webpart-dev experience, no seminars and the like. Therefore it's do it on your own or die. I bought a book and started working through it on my virtual machine, messing with the infrastructure and everything I can get a grip on. My main problem, however, remains the actual development. I have managed to write 2 webparts which can be used as a rudimentary ticket system (using WSP Builder and SP2007). But there it stops. Are there any comprehensive step-by-step tutorials or blogs out there, like the ASP.NET tutorials on www.asp.net, which take you by the hand and go over each step with you? Starting with the basic classes, going over custom CSS implementation, jQuery/JavaScript AJAX and async calls? No matter how trivial, I appreciate every help and hint you can give.

    Read the article

  • Moving to New Machine... also upgrade to 64bit. What steps?

    - by Kendor
    I am about to move to a new Lenovo X201 from my current X61. The current setup has a separate /home, a separate swap file, and also a separate /Data partition. I am currently running 10.04 32-bit. I am considering running 64-bit on the new machine because I will now have 8 GB of RAM, and I would also like to move to 10.10. Ideally I would like to preserve as much of my current setup as possible. The new machine has Win7 on it, but I will blow that away, as I've made a Clonezilla copy of it and will use VirtualBox for when I need Windows. Can someone suggest a good step-by-step for me? I'm networked to a NAS and also have plenty of external USB storage in case I need intermediary steps. So do I set up the new machine first with 64-bit 10.10, with the partition scheme I want, then rsync over /home from the old machine (overwriting the target home)? Do I need to upgrade the X61 to 10.10 first?
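
    For the copy step itself, a rough sketch of pulling the old /home across, run on the new machine after the fresh 10.10 install; old-laptop and the user name kendor are placeholders, and a 32-bit-era home directory is generally fine on a 64-bit system since it holds data and dot-files rather than binaries:

      # Pull the old home directory onto the new machine, preserving permissions and ACLs
      rsync -aAXvz old-laptop:/home/kendor/ /home/kendor/

      # The data partition can be copied the same way, or restored from the NAS
      rsync -aAXvz old-laptop:/Data/ /data/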

    Read the article

  • Encapsulate standard C functions?

    - by Jack Stout
    While studying the C programming language and learning safe practices, I'm inclined to write a layer of functionality over several parts of the standard library. This would serve two purposes: I could use standard parts of the language in ways that feel more familiar or rational to me, and I could easily replace that functionality with my own if I needed to. I could benefit from this, but should I do it? As an example, consider memory management. If I've written malloc() into the constructors of each of my objects and then decide that I need to handle memory allocation on my own, I have to edit the constructor associated with every object. By referencing my own function instead, I can change the contents of that function without writing new constructors. It seems obvious that I should do this, but I'm used to Python. I'm extremely comfortable in that environment and have no problem linking to any part of the standard library from any part of my program, because I know I will almost certainly leave that relationship untouched for the life of the project. The situation I'm running into with C feels like I'm trying to hide the language from myself. Will writing a layer of functionality over the C standard library help me in learning the language and developing a codebase, or will it stifle my understanding going forward?
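
    As a concrete illustration, a minimal sketch of such a layer for allocation only (the names my_alloc and my_free are arbitrary); constructors call this instead of malloc() directly, so switching to a pool allocator later means editing one function:

      #include <stdio.h>
      #include <stdlib.h>

      /* The single place every constructor goes through for memory. */
      void *my_alloc(size_t size)
      {
          void *p = malloc(size);
          if (p == NULL) {
              fprintf(stderr, "my_alloc: failed to allocate %zu bytes\n", size);
              exit(EXIT_FAILURE);
          }
          return p;
      }

      /* Matching release hook, so the allocation strategy can change in one spot. */
      void my_free(void *p)
      {
          free(p);
      }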

    Read the article

  • Open Directory authenticated bind succeeds, but creates incomplete record

    - by Jay Thompson
    I have about a dozen Macs running 10.6.7 or 10.6.8, which are all failing to bind properly to my new 10.7.4 Server OD. I can bind them just fine via Directory Utility or dsconfigldap, and it reports success. However, when I look at the record, it is failing to write the MAC address. Even if I manually update the record with the MAC address, MCX doesn't do anything and clients can't log in to OD accounts. All of the affected clients have hundreds of lines in the /Library/Logs/DirectoryService.error.log like so: 2012-09-15 22:23:18 EDT - T[0x00007FFF70292CC0] - GetMACAddress returned 0x *** bad control string *** 8x I do know that all of these clients were previously managed with the Guest computer account, and I also know that they were all imaged with a DeployStudio image when they were purchased. I've tried dscacheutil -flushcache, but after that I'm drawing a blank. Google has a few hits, but nothing very helpful. Re-imaging would be ideal but probably isn't going to happen. Anyone come across this before?

    Read the article
