Search Results

Search found 28593 results on 1144 pages for 'best pratices'.


  • VoIP setup for one external PSTN line

    - by Jcl
    I'm completely new to VoIP and I'm trying to work out the best setup for this. I need 4 wireless extensions (maybe 5 or 6 in the future) connected to 1 PSTN line, possibly 2 lines later on. Everything I've found so far seems over-the-top and extremely expensive. The main problem is that the building we're in can't get a decent internet connection, so an external VoIP "virtual PBX" is not an option. Even though we're small, phone service is critical to this organization. I currently have an analog DECT/GAP PBX that does what I need, but the PBX itself is poor and the call quality is horrible, which is why I want to replace it. The requirements: 4 wireless terminals (running cable is not an option), all of them ringing on incoming PSTN calls; the ability to make internal calls between the 4 separate offices and to pass calls between terminals; and access to the external PSTN line from any terminal without dialing special codes. Very important: terminals must be able to send commands to the external operator over the PSTN line in the form *nn*nnnnnnnn#. I don't know whether this will turn out to be a problem, but I've had trouble with analog PBXes that treat any * as a PBX command and won't pass it through to the external line. Not essential, but nice to have: music on hold. Could anyone recommend such a setup? The budget is extremely limited (there's no hard limit, but it should be as close to zero as possible). I have enough spare, powerful computers and a 300 Mbps wireless network that works just fine, so those don't count against the budget. I don't really know if this is the best place to ask, but it's the most relevant Stack Exchange site I've found for this subject.
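
    One route that often comes up for this kind of zero-budget requirement (my suggestion, not something from the question) is to run Asterisk on one of the spare PCs, attach the PSTN line through an inexpensive FXO gateway or card, and register the handsets (SIP DECT bases or Wi-Fi SIP phones) as extensions 101-104. A rough extensions.conf sketch, where the context names, extension numbers and the DAHDI/1 channel are all assumptions:

        [incoming-pstn]
        ; ring all four handsets when the analog line rings
        exten => s,1,Dial(SIP/101&SIP/102&SIP/103&SIP/104,30)

        [handsets]
        ; internal extension-to-extension calls and transfers
        exten => _10X,1,Dial(SIP/${EXTEN},20)
        ; any other number goes straight out on the PSTN line, no access code needed
        exten => _X.,1,Dial(DAHDI/1/${EXTEN})
        ; operator command strings of the form *nn*nnnnnnnn# are passed to the line as dialed
        exten => _*X.,1,Dial(DAHDI/1/${EXTEN})

    Whether the trailing # survives end-to-end depends on the handsets' dialing behaviour, so that part in particular is worth testing before committing to the setup.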

    Read the article

  • copying folder and file permissions from one user to another after switching domains [closed]

    - by emptyspaces
    Please excuse the title; it was the best way I could think of to describe this scenario without writing an entire paragraph. I am using C#. I currently have a file server running Windows Server 2003 joined to a domain, which we'll call oldDomain, and about 500 user accounts with various permissions on this server. Because of restrictions outside my control we are abandoning this domain and moving to another one that is more dominant within the organization, which we'll call newDomain. All of the users that have accounts on oldDomain also have accounts on newDomain, but the usernames are completely different and there is no link between the two. What I'm hoping to do is generate a list of all user accounts and their SIDs from AD on oldDomain; I already have this part done using dsquery and dsget. Then I'll have someone go through and match every oldDomain account to the correct username on newDomain, ultimately leaving me with a list of SIDs from oldDomain and the corresponding usernames on newDomain. Once I join the server to newDomain, I want to copy the file and folder permissions held by each old SID to the matching new user. Can anyone tell me the best way to copy permissions from an oldDomain SID to a user on newDomain? There are plenty of articles out there about copying permissions from user A to user B, but I wanted to check what the recommended practice is here, since there are a ton of directories.
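
    One approach that is often suggested for exactly this kind of domain migration (an assumption on my part, not something from the question) is Microsoft's SubInACL tool, which can rewrite existing ACL entries from one account to another in place once the server has joined newDomain. Run once per entry in the oldDomain-to-newDomain mapping, it looks roughly like this (the path and account names are placeholders):

        subinacl /subdirectories "D:\Shares\*.*" /replace=OLDDOMAIN\jsmith=NEWDOMAIN\john.smith

    While oldDomain is still reachable it is safest to drive /replace with the old account names; whether a raw SID is accepted on the left-hand side once the old domain is gone is something to verify against SubInACL's documentation before relying on it.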

    Read the article

  • Optimum configuration of McAfee for Servers

    - by Wayne Arthurton
    Our corporate standard is McAfee Enterprise; unfortunately this is non-negotiable. On the two types of servers I'm responsible for, SQL and web, we have noticed major performance issues with the corporate standard setup:
    - Maximum scan time: 45 seconds
    - One policy for all processes
    - Scan ALL files on write, read, and open for backup
    - Heuristics: find unknown programs, trojans, and macros
    - Detect unwanted programs
    - Exclusions: EVT, LDF, LOG, MDF, VMD, Windows File Protection
    This of course still causes major slowdowns: IIS/.NET recompiles are slow (especially with SharePoint), as are SQL backups and restores, SQL Analysis Services, Integration Services, and the temp data they generate. I have looked from time to time for best practices on setting up McAfee for SQL Server, SQL Analysis Services, SQL Integration Services, Visual Studio, SharePoint, and .NET web servers in general. How do people set up McAfee Enterprise on their corporate servers, keeping security intact while affecting performance as little as possible? Has anyone run across white papers on these setups? Obviously some of it is case by case, but there must be some best practices out there somewhere.

    Read the article

  • Bad results converting PDF to EPS on Linux

    - by Tim
    I'm having some trouble converting PDFs (created by Adobe Illustrator on a Mac) to EPS. I have tried several things, but I am wondering if there is a better option. The following list is ordered by decreasing quality:
    - inkscape --export-area-page --export-eps=out.eps in.pdf: the graphical program Inkscape works best, but is a bit slow;
    - pdftops -eps in.pdf out.eps: uses Poppler, works well and is fast;
    - pdf2ps in.pdf out.eps: uses Ghostscript and works OK for simple documents;
    - convert in.pdf out.eps: uses ImageMagick and always rasterizes the image.
    I haven't tested acroread -toPostScript (Acrobat Reader, Linux only). Some issues I've found: transparency is not supported in EPS, but instead of flattening the layers most programs rasterize the image, producing big files and ugly graphs (Inkscape handles this best by rasterizing only the unsupported area); gradients are rendered properly by Inkscape, but Poppler somehow chops a gradient up into many shapes of different colors; Greek symbols are seemingly not supported by Ghostscript and get rasterized (using pdf2ps). What are your experiences with this kind of task? Did I forget certain programs and/or command-line options that would improve quality? I found some posts on this, but not a thorough comparison of the possibilities; please correct me if I'm wrong. Related post: "How to convert PDF to EPS?" on the TeX site.
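
    Since pdftops gave the best speed/quality trade-off after Inkscape in the list above, one option is a small batch script that uses Poppler by default and can be switched to Inkscape for the files whose transparency or gradients Poppler mangles. A rough sketch, assuming both tools are installed and the PDFs are in the current directory:

        #!/bin/sh
        # Convert every PDF in the current directory to EPS.
        # Set USE_INKSCAPE=1 to use Inkscape instead of pdftops (Poppler).
        for f in *.pdf; do
            out="${f%.pdf}.eps"
            if [ "$USE_INKSCAPE" = "1" ]; then
                inkscape --export-area-page --export-eps="$out" "$f"
            else
                pdftops -eps "$f" "$out"
            fi
        done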

    Read the article

  • Are Windows Domain Service Accounts Really Necessary?

    - by Zach Bonham
    One of the biggest problems we have in automating application deployments is the idea that running IIS app pools and Windows services under domain service accounts is a 'best practice'. Unfortunately, this best practice sometimes causes deployment headaches: either we need to provision a new domain-level service account quickly, or, once we have the account, we have to manage its credentials. I had a great conversation about dropping the domain-level service account requirement and taking one of two approaches instead: (1) secure at the node level using the machine account (domain\machine$) and add the node to the appropriate Active Directory / SQL groups and roles, or (2) create local, app-specific accounts on each machine (machine\myapp) and add those accounts to the appropriate Active Directory / SQL groups and roles (the password here can change per deployment; it doesn't need to be stored). In both cases it seems easier to add an account to the appropriate group or role, or even to stand up a new local account, than it is to provision a new domain-level account and manage its credentials. This would hopefully ease the management burden on the Active Directory, SQL Server, and operations teams, since there would be no more password management. We've not actually been able to implement this in practice yet. I come from a development background, so I'm curious how many ways this approach could go wrong. Can we really get rid of domain-level service accounts with this direction? I'd appreciate any thoughts from anyone who has taken this path! Thanks! Zach

    Read the article

  • Development on Windows 7; Web server on Linux - How to share Apache web root?

    - by TheKeys
    I've got a LAMP server that I want to use as a local web server and a Windows 7 machine that I want to use as my development machine. The machines will be on the same LAN (or the Windows box will be VPNed into the LAN). My question is: what is the best way to share the web root of the LAMP server so that I can edit the files from the remote Windows 7 machine, and how do I go about configuring this on the Linux machine (Fedora 16)? I would like the solution to be as easy to use as possible, preferably with no extra steps required to save/edit/upload files from my IDE on the Windows 7 machine. I'm thinking either a Samba or an NFS share is the way to go, but I'm concerned I'll run into issues with permissions and Unix/Windows file handling. Is one better than the other for my use case, or is there a better alternative solution? I'm currently using Windows 7 Professional, which doesn't have NFS support, but I would upgrade to Ultimate (which does) if NFS is the best solution.
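
    For the Samba route, a minimal share definition is often enough; the sketch below is an assumption on my part (share name, path, user and group are placeholders) rather than anything from the question. Forcing the group and the masks avoids most of the Windows-versus-Unix permission surprises when Apache also has to read the files:

        [webroot]
            path = /var/www/html
            valid users = devuser
            read only = no
            ; keep files created from Windows readable by Apache
            force group = apache
            create mask = 0664
            directory mask = 0775

    After adding this to /etc/samba/smb.conf, create the Samba user with smbpasswd -a devuser, restart the smb service, and map the share as a drive letter from Windows 7 so the IDE can save straight to it.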

    Read the article

  • Bluehost Emails Getting Blocked

    - by colithium
    A site for my client has the run-of-the-mill "website with users" email pattern: create an account, get an activation email; get an email when a subscription is expiring; and so on. The site is hosted on Bluehost and currently uses PHP's mail() function. There isn't much configuration allowed (as far as I know). The trouble is, about a third of these emails disappear into the void. They aren't in spam or junk folders, there's no bounce message; they just cease to exist. I've read about Bluehost email troubles, but I can't figure out what my options are for fixing this. These aren't marketing emails, i.e. they contain user-specific information. I suppose a solution that offers a good templating system would be fine. What are my options? Excerpt of the headers when a message is delivered to a Gmail address:
    Received-SPF: neutral (google.com: 00.000.000.000 is neither permitted nor denied by best guess record for domain of domain@box###.bluehost.com) client-ip=00.000.000.000;
    DomainKey-Status: good
    Authentication-Results: mx.google.com; spf=neutral (google.com: 00.000.000.000 is neither permitted nor denied by best guess record for domain of domain@box###.bluehost.com) smtp.mail=domain@box###.bluehost.com; domainkeys=pass [email protected]
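
    One thing worth checking (my suggestion, not something from the post) is whether the sending domain publishes an SPF record at all; the "spf=neutral ... best guess record" in the headers suggests it does not, and some receivers silently discard unauthenticated transactional mail. From any machine with dig available:

        # Show the TXT/SPF record currently published for the sending domain
        dig +short TXT example.com

        # Look up the Bluehost box that actually sends the mail (box### as seen in the headers)
        dig +short A box###.bluehost.com

    If no SPF record exists, publishing one that includes Bluehost's outgoing servers (their documentation has the exact value to include), or sending through authenticated SMTP instead of mail(), usually improves deliverability. The domain and hostname above are placeholders taken from the redacted headers.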

    Read the article

  • Alternatives to using email (in particular, Outlook) as a knowledge store?

    - by Umber Ferrule
    I suspect that, like many people, I use my work email account (accessed via Outlook 2007) to store information. I generally try to group similar things in folders and sub-folders, but with a multitude of folders this gets very unwieldy; in particular, it can be a bind to locate things using Outlook's tree structure. (As an aside, I've yet to come across a good free search add-on for Outlook.) I realise Outlook is not the best place to store all my information, and I'd prefer not to. In an ideal world I'd like to organise everything currently stored in Outlook in a mind map (my software of choice being Freemind) or a wiki. To maintain an email audit trail, I've considered saving individual emails as files and using the mind map or wiki to link to them. What do people think of this? (I can't say I relish the thought of the exporting process!) Whatever I do is going to either involve some pain (setting up a wiki/mind map) or mean sticking with what Outlook currently provides. Has anyone been in the same position? Has anyone mass-migrated information out of Outlook? If so, what was the best way? Any ideas or alternative proposals?

    Read the article

  • Will Intel be releasing any more 6-core processors soon?

    - by jasondavis
    I am about to start buying parts every week for as long as it takes to build the best PC I can. I am looking at the Intel i7-920 processor right now because it is about $250 and it is a quad-core processor based on the X58 chipset, I believe. From what I have read so far, Intel is coming out with some 6-core processors soon that will also use the X58 chipset, which would let me keep the same motherboard and RAM and upgrade to a 6-core later. That sounds really good to me right now. I just read that the first 6-core processor, the Core i7-980X (Extreme Edition), has just been released, but it is supposed to cost around $1,000, so I will probably get the i7-920 for now and upgrade to the 6-core version when the price comes down. The motherboard I am looking at is the GIGABYTE GA-X58A-UD5, which is around $280 at newegg.com. So that is my basic plan so far; I have not purchased any parts yet. I am just asking whether this sounds like a good idea, or whether I should wait longer if I eventually want a 6-core processor. Does anyone know if Intel is planning to release any other 6-core processors in addition to the Core i7-980X in the near future? I just want to make sure I am buying the best setup for my money if I am going all out on it. Thanks for any tips/advice.

    Read the article

  • Cache updates when migrating DNS from one provider to another

    - by JohnCC
    This may be a Windows DNS-specific question or a general DNS best-practice question; I'm not sure! We migrated our third-party DNS provision from provider A to provider B. I noticed that our internal recursive Windows DNS servers still had NS records cached for our domains pointing to provider A's servers, even though I changed the nameservers with our registrar several days ago, and even though the properties of the cached records show a TTL of 1 day. After 24 hours, when the NS records in this cache have expired, will the DNS server go back to the TLD servers for an update on the authority, or will it go by preference to dns1.providera.com, since that is what it has cached? In this case I arranged to leave provider A's servers up for a week to allow changes to propagate, so dns1.providera.com is still active and still serves NS and SOA records saying that dns1.providera.com. is in charge of the domain. Given that, would the Windows DNS server ever go back to the TLD and pick up the authority changes, or would it just assume all is well and renew the timestamps on its cached NS records? I wonder what the best approach is to ensure that caches pick this up. Should I:
    (1) Leave provider A's servers in place and active and wait for caches to catch up; basically what we're doing now, which seems to have issues, perhaps specifically for Windows servers, or perhaps more widely.
    (2) Leave provider A's servers in place but change the NS and/or SOA information they serve, to tell caches that new servers are in charge.
    (3) Remove provider A's servers after 2*TTL to force remaining caches to update.
    The issue with (2) is that on provider A's system I can't seem to change the NS or SOA information to anything other than their servers. The issue with (3) is that I'm not sure how a DNS server would behave in this case: when it can't reach the cached name servers, would it flush its cache and try a full recursive lookup, or would it just return an error, forcing the user to clear the cache manually? Thanks in advance!
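
    A quick way to see whether the delegation change has actually reached the parent zone, and what a given resolver still believes, is to compare the NS set handed out by the TLD with the cached answer. The commands below are generic examples rather than anything from the question:

        # Walk the delegation from the root down; shows the NS records the TLD currently serves
        dig +trace example.com NS

        # Ask the internal recursive Windows DNS server what it has cached
        dig @192.0.2.10 example.com NS

    On the Windows side, clearing the server's cache (dnscmd /ClearCache, or the equivalent action in the DNS console) forces it to walk down from the TLD again instead of refreshing the stale NS set it learned from provider A.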

    Read the article

  • Does the Lenovo T60p VGA port support an S-Video signal?

    - by Matthijs Wessels
    I just bought a new television. The problem is that it turns out it doesn't have a VGA port; it has S-Video, component, HDMI, and SCART. My Lenovo T60p only has VGA. I have searched frantically for a solution, and even though it seems I have many options, they are all dead ends, or I keep ending up having to buy a 100-euro box to convert the signal. However, I found that some video cards support an S-Video signal through the VGA port; the advice is to look it up in your video card's documentation. I have a Lenovo T60p laptop with an ATI Mobility FireGL V5250, but I can't seem to get my hands on any documentation where this is covered. I found this thread: http://forum.notebookreview.com/showthread.php?t=179529&highlight=s-video where one poster thinks the feature was present in the T60 but dropped in the T61, yet suggests to the person with the T60 that it won't work, so I can't really conclude anything from that. Furthermore, I am not looking for the best possible quality, so when I found this adapter: http://www.amazon.com/VideoSecu-Computor-Presentation-Converter-VGA2TV/dp/B000X3FAJU/ref=pd_cp_e_3_img I would be quite happy with it, except that I don't think I can order it because I don't live in the US. Can anybody give me a definite answer as to whether the VGA port of my Lenovo T60p (ATI FireGL V5250) supports S-Video, so that I can just buy a VGA-to-S-Video cable to achieve my goal?

    Read the article

  • Using UDF on a USB flash drive

    - by CesarB
    After failing to copy a file bigger than 4 GB to my 8 GB USB flash drive, I formatted it as ext3. While this is working fine for me so far, it will cause problems if I want to use the drive to hand files to someone who doesn't use Linux. I am thinking of formatting it as UDF instead, which I hope would allow it to be read (and possibly even written) on the three most popular operating systems (Windows, Mac OS, and Linux) without having to install any extra drivers. However, from what I've found on the web so far, there seem to be several small gotchas related to which parameters are used to create the filesystem, which can reduce compatibility (and most of the pages I found are about optical media, not USB flash drives). I would like to know:
    - Which utility should I use to create the filesystem? (So far I have found mkudffs and genisoimage, and mkudffs seems the best option.)
    - Which parameters should I use with the chosen utility for maximum compatibility?
    - How compatible is UDF, in practice, with the most common versions of these three operating systems?
    - Is using UDF actually the best idea? Is there another filesystem that would have better compatibility, with no problematic restrictions like the FAT32 4 GB file-size limit, and without having to install special drivers on every single computer that touches it?
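
    For what it's worth, the mkudffs invocation commonly suggested for flash drives (based on udftools' documented options, not on anything in the question) formats the whole device as a hard-disk-type UDF volume with a block size matching the drive's 512-byte logical sectors, which older Windows UDF drivers are picky about:

        # WARNING: this wipes the drive; double-check the device name first
        sudo mkudffs --media-type=hd --blocksize=512 /dev/sdX

    Formatting the raw device rather than a partition (i.e. /dev/sdX, not /dev/sdX1) is the variant usually reported to mount most reliably across operating systems, but that is worth verifying against the specific Windows and Mac OS versions you need to support.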

    Read the article

  • How to configure multiple virtual hosts for multiple users on Linux/Apache2.2

    - by authentictech
    I want to set up a virtual hosting server on Linux/Apache 2.2 that allows multiple users to set up multiple website domains, as would be appropriate for commercial shared hosting. I have seen examples (from my then perspective as a shared-hosting customer) that let users keep their web files in their home directory, with sub-directories corresponding to each virtual host domain, e.g. /home/user1/www/example1.com and /home/user2/www/example2.com, instead of using /var/www. Questions:
    (1) How would you configure this in your Apache configuration files? (Don't worry about DNS.)
    (2) Is this the best way to manage multiple virtual hosts? Are there others?
    (3) What safety or security issues do you think I should be aware of in doing this?
    Many thanks, folks. Edit: if you want to answer only question 1, please feel free; that is the most urgent for me at this moment and I would consider it an answer to the question. I have done it for myself since posting, but I am not confident it's the best solution and I would like to know how an experienced sysadmin would do it. Thanks.
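
    For question 1, a minimal name-based virtual host for Apache 2.2 might look like the sketch below; the paths, domain, and the idea of one file per site under conf.d are assumptions rather than requirements. The Order/Allow directives are the 2.2-style access control:

        NameVirtualHost *:80

        <VirtualHost *:80>
            ServerName example1.com
            ServerAlias www.example1.com
            DocumentRoot /home/user1/www/example1.com
            <Directory /home/user1/www/example1.com>
                Options FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
            ErrorLog logs/example1.com-error_log
            CustomLog logs/example1.com-access_log combined
        </VirtualHost>

    Apache also has to be able to traverse the home directories (e.g. chmod o+x /home/user1 /home/user1/www), which is one of the security trade-offs of hosting out of /home; suEXEC or mod_suphp are the usual ways of keeping one user's scripts from reading another user's files.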

    Read the article

  • Optimal Networking Setup for a 2-Story unit?

    - by user29336
    I am moving into a 4-bedroom, two-story unit, roughly 2,200 sq ft. I want the maximum possible throughput at all of our focal points. We're all in internet-related industries; between gaming and web development, latency and throughput are major factors for us. Our main focal points are:
    (1) the garage (office), downstairs;
    (2) each of the 4 bedrooms, upstairs;
    (3) the living room, downstairs.
    The fastest line we can get is Comcast 50 Mbps down / 5 Mbps up (Wideband). I am looking for the best way to achieve good wireless and wired performance for this setup. Our gaming computers will usually be in our bedrooms, but we may also bring them down to the office now and then for "LAN" sessions. Most wireless use will happen downstairs with our laptops, but since we may do LAN sessions, wired latency may be important there too. My concerns: if we go wireless-only there may be too much latency for gaming, and I don't know if placing one D-Link DGL-4500 on the top floor would be enough; I currently own one (http://dlink.com/us/en/home-solutions/support/product/dgl-4500-xtreme-n-gaming-router), and as far as I'm aware wireless signals propagate best from the top down. Would that one wireless router on the top floor be enough on its own? My second strategy was a combination of wiring and wireless, but I'm not sure of the easiest way to do this. This is a place we're renting, so I'm not sure how much leeway we have with wiring, but we're all pretty competent; if we can't drill through a wall we can probably "stitch" cable around the edges wherever needed. Thoughts on the optimal way to do this?

    Read the article

  • Expanding to dual video cards

    - by Anthony Greco
    I know a lot of factors can come into play here, so I will list my current hardware and setup:
    - Motherboard: GIGABYTE GA-890FXA-UD5 [http://www.newegg.com/Product/Product.aspx?Item=N82E16813128441]
    - Processor: AMD Phenom II X6 1090T Black Edition Thuban 3.2 GHz [http://www.newegg.com/Product/Product.aspx?Item=N82E16819103849]
    - RAM: G.SKILL Ripjaws Series 16 GB (4 x 4 GB) [https://secure.newegg.com/NewMyAccount/OrderHistory.aspx?RandomID=4933910872745320111128011418]
    - Current video card: EVGA 01G-P3-1366-TR GeForce GTX 460 SE [http://www.newegg.com/Product/Product.aspx?Item=N82E16814130591]
    - OS: Windows 7 Ultimate x64
    Currently I can run 2 monitors just fine with this setup; however, I want to upgrade to 4 monitors. My question is: what is the best way to do this? I remember reading in the past that I need the same type of video card. Would any GeForce GTX work, or do I need that exact model (EVGA 01G-P3-1366-TR GeForce GTX 460 SE)? Are there any issues I should be aware of before I order 2 new monitors and a video card? Are there video cards better suited to this? I know NVIDIA offers SLI, but I don't know whether my motherboard supports it; the board also offers a CrossFireX configuration, though from what it says only Radeon cards qualify for that. Any suggestions or feedback on the best route with my current setup is appreciated; even if you suggest buying 2 new identical video cards, as long as you mention which ones and why, I really appreciate it. Note: I really don't do any gaming. I sometimes do some 3D work in Unity and very rarely in Maya; besides that I mostly work in Visual Studio and Photoshop. However, I need the 2 extra monitors because I sometimes monitor 5 remote desktops at once, and switching between them on only 2 screens is becoming a very big pain; seeing 3 side by side while I work on the 4th would be very helpful. Again, I appreciate any feedback, as I have googled a bunch and just want to make sure what I buy will work.

    Read the article

  • Will these instructions work when turning off journaling on an ext4 SSD?

    - by snowlord
    I have an Acer Aspire One with an SSD for storage. I recently installed Ubuntu on it and chose ext4 for my filesystem. Then I read that journaling on an SSD isn't the best idea, so I want to disable journaling, and I have found these instructions (from http://fenidik.blogspot.com/2010/03/ext4-disable-journal.html):
        # Create ext4 fs on /dev/sda10 disk
        mkfs.ext4 /dev/sda10
        # Enable writeback mode. This mode will typically provide the best ext4 performance.
        tune2fs -o journal_data_writeback /dev/sda10
        # Delete has_journal option
        tune2fs -O ^has_journal /dev/sda10
        # Required fsck
        e2fsck -f /dev/sda10
        # Check fs options
        dumpe2fs /dev/sda10 | more
    For more performance, the page says to add the fstab options data=writeback,noatime,nodiratime, i.e.:
        /dev/sda10 /opt ext4 defaults,data=writeback,noatime,nodiratime 0 0
    I will use them on my boot partition. Are there any particularly bad parts here, or any missing steps? Will my boot partition be fit for being on an SSD after this? Or should I consider switching to ext2, or even reinstalling everything and choosing ext2 at partitioning time? (I'd rather not, though, since I've already configured quite a lot.)

    Read the article

  • Linux/Unix in Windows

    - by Dmitriy Nagirnyak
    Hi, what would be the best way to get a full-blown Unix/Linux bash environment inside Windows? I don't mean a virtual machine, but rather just the terminal, with the NTFS drives mounted, so I could use the power of Unix/Linux while still being on Windows. The things I want to be able to do from the terminal:
    - package management (apt-get, as in Debian);
    - SSH;
    - file operations (including grub and similar);
    - running a web server (Apache, nginx) for testing purposes.
    It should be easy to use: start the terminal and Linux is on; close the terminal and Linux is shut down. It would also be nice to be able to copy and paste between Windows and the terminal and vice versa. This really feels like a separate OS, and I realize a VM would probably be the best fit, but I'm guessing a lighter installation should be possible. Note: I can't just switch to Linux, because I still need to do development on Windows. Also, I am a Linux newbie, just getting started with it, so sorry if I'm asking something obvious or stupid. Thanks, Dmitriy.

    Read the article

  • Managing multiple IMAP accounts in Thunderbird

    - by baritoneuk
    I've been using Thunderbird for years, without issues, with 20+ POP3 accounts. I'm moving over to IMAP, which will let me keep copies of the emails locally and on the server while keeping everything synchronised. However, I'm looking for the best way to manage multiple IMAP accounts in Thunderbird. Currently I have a filter that copies all incoming mail into a central inbox and into separate local folders. The reason is that I go through my inbox daily and delete every email that doesn't require action; anything that does require action I move to my "action" IMAP folder. That way I can synchronise all actionable email across multiple computers (and mobile devices); this is my implementation of the GTD (Getting Things Done) philosophy. I also copy each email into separate local folders, just in case any emails on the IMAP accounts get deleted or something drastic happens on the server and I lose everything. My business partner has access to some of these emails and still uses POP3 (with "leave copy on server" checked), and I know Thunderbird can sometimes still delete emails off the server. The problem with the above is that Thunderbird gives me the dreaded error dialog saying that the emails cannot be filtered because another process is using them, and my folder list has grown into something complicated and hard to manage. What would be the best way to manage multiple IMAP accounts while still copying everything into a central folder and into local folders? Or, if people think that isn't necessary, is there a better way? How do others manage multiple IMAP accounts in a way that lets them keep on top of actionable email? I'd be interested in how others handle this. I've never used the Thunderbird-based client Postbox; does it handle multiple IMAP accounts better?

    Read the article

  • Error while installing boost_1_54

    - by Farhat
    On trying to install Boost, I get this error during the configuration checks; googling did not give any pointers.
        [root@heracles boost_1_54_0]# ./b2 install
        Performing configuration checks
            - 32-bit                   : no  (cached)
            - 64-bit                   : yes (cached)
            - arm                      : no  (cached)
            - mips1                    : no  (cached)
            - power                    : no  (cached)
            - sparc                    : no  (cached)
            - x86                      : yes (cached)
        error: No best alternative for libs/coroutine/build/allocator_sources
            next alternative: required properties: <link>static <target-os>windows <threading>multi
                not matched
            next alternative: required properties: <link>static <segmented-stacks>on <threading>multi
                not matched
            next alternative: required properties: <link>static <threading>multi
                not matched
            - has_icu builds           : no  (cached)
        warning: Graph library does not contain MPI-based parallel components.
        note: to enable them, add "using mpi ;" to your user-config.jam
            - zlib                     : yes (cached)
            - iconv (libc)             : yes (cached)
            - icu                      : no  (cached)
            - icu (lib64)              : no  (cached)
            - compiler-supports-ssse3  : yes (cached)
            - compiler-supports-avx2   : no  (cached)
            - gcc visibility           : yes (cached)
            - long double support      : yes (cached)
        warning: skipping optional Message Passing Interface (MPI) library.
        note: to enable MPI support, add "using mpi ;" to user-config.jam.
        note: to suppress this message, pass "--without-mpi" to bjam.
        note: otherwise, you can safely ignore this message.
        error: No best alternative for libs/coroutine/build/allocator_sources
            next alternative: required properties: <link>static <target-os>windows <threading>multi
                not matched
            next alternative: required properties: <link>static <segmented-stacks>on <threading>multi
                not matched
            next alternative: required properties: <link>static <threading>multi
                not matched
            - zlib                     : yes (cached)
    How can the right alternative for allocator_sources be located? Thanks.

    Read the article

  • Windows Server 2008 (Web Server) Replication

    - by justjoshingyou
    We have a load-balanced environment with Windows Server 2008. What are some best practices for setting up replication across the web servers? Do I only want to replicate the web folders? What about replicating IIS changes, or do I need to make IIS changes on every server? I've never set up replication before, but I have worked with a web farm that used it; basically I only know the basics of how it works, and I'm looking for any advice, guides, warnings, etc. on setting this up. In case it helps, here's our environment for now: we have 1 production server up and the second is nearly ready to go. We are using a cloud system and all machines are VMs. I am in the process of setting up the domain controller now (as I need one for DFS). Any ideas on the best way to go about setting up replication? Should we put the production server in from the start, or set things up using a test VM and our second server and then switch it over later? I do not want to risk overwriting our production server. Thanks!

    Read the article

  • Offline cache copies in Windows file sharing

    - by netvope
    I frequently access media files (music or video) on a remote Windows file share. My internet connection is not very fast, and I find it a waste of bandwidth to fetch the same files repeatedly; for example, I may listen to the same song 30 times in a month. So I would like to cache the files I've used. I know Windows has an "Always available offline" feature, but I don't think it suits my needs: I don't want to make the whole share available offline, as the remote share is huge (terabytes), and making individual files available offline is tedious because the files are scattered across many different directories. It would be much more convenient if the system could simply cache the files I've accessed. I could also manually make a local copy each time I use a file, but that is even more troublesome than marking each file "available offline". A few more details: the files on the share seldom change; many of them are rarely used, while some are used frequently; and I don't have a list of the most frequently used files. Ideally I could tell Windows to cache the last 10 GB of accessed data, but apparently it doesn't have that feature. So I think the best option is an SMB/CIFS caching proxy. What do you think? I have a Linux box sitting around; perhaps I should try to set up Samba 4?

    Read the article

  • Convert .tod (MPEG-2) files from a JVC Everio to AVI, MPG, or WMV for Movie Maker

    - by yearofhao
    If you have a JVC Everio camcorder, you may run into problems after saving its .tod (MPEG-2) files to your computer: Windows Movie Maker says it can't recognize or edit them, and while a media player may play them, the question is how to edit them. The bundled Power Cinema software can be annoying, since it only edits while the camera is plugged into the PC; it doesn't seem able to edit from the saved clips alone. So how do you save the clips to the PC in a form you can edit, without the camera, using Windows Movie Maker? A paid TOD-to-AVI/MPG/WMV converter such as iOrgsoft's converts .tod files to AVI, MPG, WMV, YouTube FLV, MP4, DV, QuickTime MOV, and other common video formats quickly while keeping the original HD quality, so high-definition TOD recordings from JVC camcorders can be played back, converted, and edited. The tool has mostly been used by Windows 7 and Vista users; after the .tod (MPEG-2) files are copied from the JVC Everio to the PC, the usual targets are AVI (XviD/DivX or uncompressed/raw), MPG, or WMV, which are the formats Windows Movie Maker imports best. It also lets you clip/cut and crop TOD video before encoding, and helps transfer video to devices like an iPhone or iPod, or to an HDTV connected to an Apple TV.
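
    For reference, since .tod files are ordinary MPEG-2 streams, the free command-line tool ffmpeg can do the same conversion (my suggestion, not something from the post). A hedged example that re-encodes one clip into a Movie-Maker-friendly AVI, assuming ffmpeg is installed and with the filename as a placeholder:

        # Re-encode an Everio .TOD clip to MPEG-4 video and MP3 audio in an AVI container
        ffmpeg -i MOV001.TOD -c:v mpeg4 -q:v 3 -c:a libmp3lame -q:a 4 MOV001.avi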

    Read the article

  • Using Toshiba 22EL833 as PC display through HDMI input

    - by Oleg V. Volkov
    I had another Toshiba TV, a 19SL738, connected to this same PC and video card (GeForce 8800 GTX) through DVI-to-HDMI (DVI on the PC side, HDMI on the TV), and it worked perfectly at its native resolution of 1360x768. Some time ago I had to change to the 22EL833 and immediately hit a problem: both the Windows 7 control panel and the NVIDIA control panel report the new TV's native resolution as 1080i (1920x1080), despite the TV's documentation saying it has the same 1360x768 panel as the previous one. Practical tests confirmed that the true native resolution is indeed 1360x768: plugging in through DVI-to-VGA and setting a custom resolution in the NVIDIA panel gives clear colors and a crisp image, while setting anything different, over either DVI-to-VGA or DVI-to-HDMI, produces horribly distorted or squashed images with almost unreadable thin lines (in letters, for example). Now, my problem is that there are no drivers for this TV and I'm unable to get a good image when connecting it through DVI-to-HDMI directly. The best I've achieved is editing the EDID/driver manually to persuade the system that the native resolution should be 1360x768; the image then became mostly clear, but the colors took on a strange washed-out look, with pools of pure yellow, cyan, and magenta here and there in place of other colors, and gradients became noticeably banded as well. Somehow it looks like dithering gone bad, which makes me suspect the image is still being down/upscaled several times internally somewhere along the line. How can I connect this TV to the DVI output of my video card to get the best possible image, with correct colors and the correct native resolution?

    Read the article

  • How to open semicolon-delimited CSV files in the US version of Excel

    - by Holgerwa
    When I double-click a .csv file, it opens in Excel. These CSV files have columns delimited with semicolons (not commas, but still a valid format). On a German Windows/Excel setup, the file is displayed correctly: the columns are separated where the semicolons are in the file. But when users do the same on a (US) English Windows/Excel setup, only one column is imported, showing all the data, semicolons included, in the first column. (I don't have an English setup available for testing; users have reported this behaviour.) I tried changing the list-separator value in the Windows regional settings, but that didn't change anything. What can I do so that these CSV files open correctly on double-click on an English setup? EDIT: It seems the best solution in this case is not to rely on CSV files at all. I was hoping there was some formatting for CSV files that would make them work internationally; instead, the best solution seems to be switching to creating XLS files. Thanks to all for your suggestions and helpful tips!
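
    For reference, one widely used workaround (not mentioned in the question, so verify it against the Excel versions you target) is to put a separator hint on the very first line of the file; when such a file is opened by double-clicking, Excel uses the declared separator regardless of the locale's list-separator setting:

        sep=;
        name;city;value
        Alice;Berlin;42

    The downside is that the sep= line becomes part of the data for any other tool that reads the file, which may be part of why switching to XLS ended up being the preferred solution here.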

    Read the article
