Search Results

Search found 19788 results on 792 pages for 'remote host'.


  • Cheap server stress testing

    - by acrosman
    The IT department of the nonprofit organization I work for recently got a new virtual server running CentOS (with Apache and PHP 5), which is supposed to host our website. During the process of setting up the server I discovered that the slightest use of the new machine caused major performance problems (I couldn't extract tarballs without bringing it to a halt). After several weeks of tech support casting about in the dark, it now appears to be working fine, but I'm still nervous about moving the main site there. I have no budget to work with (so no software or services that cost money), although due to recent cutbacks I have several older desktops that I could use if that helps. The site doesn't need to withstand massive amounts of traffic (it's a Drupal site with just a few thousand visitors a day), but I would like to put the server through a bit of its paces before moving the main site over. What cheap tools can I use to get a sense of whether the server can withstand even low levels of traffic? I'm not looking to test the site itself yet, just the fundamental operation of the server.
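
    For reference, a zero-cost starting point is the tooling that ships with the stack itself. As a minimal sketch (the URL, request counts, and file sizes are placeholders), Apache's bundled ab can generate basic HTTP load from one of those spare desktops, while standard Unix tools stress the CPU and disk directly:

        # Basic HTTP load from a spare desktop (ab ships with apache2-utils):
        ab -n 1000 -c 10 http://your-server/

        # Crude CPU and disk stress on the server itself:
        openssl speed sha256                                              # CPU throughput
        dd if=/dev/zero of=/tmp/testfile bs=1M count=1024 conv=fdatasync  # disk write speed

    Watching top or vmstat 1 on the server while these run should show whether the bottleneck is CPU, disk I/O, or the virtualization layer (high steal time).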


  • Not able to track traffic on subdomain using Google Analytics

    - by Steven
    I'm trying to track traffic for my sub-domain, but it's not happening. This is how it's set up: my partner has a domain called sub1.partner.com, which points to partner1.mydomain.com. The idea is that users think they are browsing my partner's website, when they are in fact browsing pages on my server. My tracking code looks like this:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-xxxxxxxx-x']);
        _gaq.push(['_setDomainName', '.mysite.com']);
        _gaq.push(['_trackPageview']);

        (function() {
          var ga = document.createElement('script');
          ga.type = 'text/javascript';
          ga.async = true;
          ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
          var s = document.getElementsByTagName('script')[0];
          s.parentNode.insertBefore(ga, s);
        })();

    In Google Analytics I've created a new account under my main account and called it partner1.mysite.com. On this account I have created a filter:

        Filter type: Include
        Filter field: Host name
        Filter pattern: partner1.mysite.no
        Case sensitive: No

    What more can I try to track traffic on my subdomain?

    UPDATE

    Question 1: Is this line correct? _gaq.push(['_setDomainName', '.mysite.com']);
    Question 2: Is it correct that I have to escape punctuation with a backslash (\.) in filters?


  • How to create shared home directories across multiple computers?

    - by Joe D
    I know there are ways to share a folder across computers to make it easy to move files. But I was wondering how one would set up a single login that lets you access the same files regardless of which machine you log in on? What I would like is something similar to a college campus, where students log in on machines in the lab and see their files regardless of which machine they use. I know there are servers involved there. I need to create this on a smaller scale, where we have a few computers available that everyone shares (and one of these could act as the server if needed and host the files). Note that the specific software installed might be different on each computer, since some computers have additional capabilities (software licenses, hardware components, etc.) that our group members will need to use on rotating schedules, but the login and OS are the same. I have not done this before, so I would appreciate detailed instructions if possible, or a reference to a guide that describes this. Thanks in advance.
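
    The classic campus setup combines network-mounted home directories (usually NFS on Linux) with a central account database (LDAP or NIS). On a handful of machines you can skip the directory service by creating the same users with the same UIDs on every box and sharing only /home. A minimal sketch, assuming Debian/Ubuntu machines, a server reachable as 'fileserver', and a 192.168.1.0/24 LAN (all placeholders):

        # On the machine acting as the server:
        sudo apt-get install nfs-kernel-server
        echo '/home 192.168.1.0/24(rw,sync,no_subtree_check)' | sudo tee -a /etc/exports
        sudo exportfs -ra

        # On each client:
        sudo apt-get install nfs-common
        echo 'fileserver:/home /home nfs defaults 0 0' | sudo tee -a /etc/fstab
        sudo mount -a

    Once the group grows past a few users, adding LDAP for the accounts themselves is the step that makes it behave like a campus lab.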


  • VMware9 fails to launch Virtual OS. Ubuntu 12.10 64bit (Unable to change virtual machine power state: Pipe connection has been broken.)

    - by pst007x
    Another issue I need help with. I use VMware for work on Ubuntu (recently upgraded from 12.04 to 12.10); however, my VMware Workstation software now generates the following error:

        Unable to change virtual machine power state: Failed to power on
        '/home/pst007x/vmware/Windows 7 x64/Windows 7 x64.vmx'.
        Transport (VMDB) error -14: Pipe connection has been broken.

        Product: VMware® Workstation
        Version: 9.0.0 build-812388
        Host OS: Ubuntu 12.10 64-bit, kernel Linux 3.5.0-18-generic

    I have seen patches, but nothing works, and the patches are not maintained (broken links). PLEASE NOTE: I know this has been asked before, but no answer was given that resolves the issue. Unfortunately I have changed all my office and personal PCs to Ubuntu and use Windows 7 in a VM; since 12.10, the VM fails to launch. For me this is a catastrophe and makes Ubuntu unusable for my work. I am in a desperate situation here; if there is anyone who can offer any help I would be truly appreciative. I have looked on the VMware forums and the only solution offered is for Fedora, but the people posting there are not clear about what the solution is! Thanks. PLEASE DO NOT CLOSE THIS QUESTION! I have tried everywhere to find a solution, but I have come up blank... Patches referenced in other posts do not work with this version of VMware...
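
    This error usually indicates that VMware's kernel modules (vmmon/vmnet) failed to build or load against the new kernel. As a hedged first step, assuming a standard Workstation 9 install, rebuild them and check the logs:

        # Rebuild VMware's kernel modules against the running kernel:
        sudo vmware-modconfig --console --install-all

        # If the build fails, the reason is usually in the newest log here:
        ls -t /tmp/vmware-root/

        # Confirm the modules load:
        sudo modprobe vmmon && sudo modprobe vmnet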


  • Hosting files with support for file tagging / keywords

    - by Zev Chonoles
    I have a large (approx. 25 GB) collection of files I would like to host online for people to view or download. I have a spare computer I can use as a dedicated server for these files. I'm looking for a method of, or piece of software for, hosting my files where I can assign tags or keywords to the files, and people viewing my files online can search the collection via the tags. As for approximate solutions I've found so far: there is software such as Collectorz.com or Readerware for creating databases of one's books / music / movies; these databases can be searched by tags or keywords and can be made available and searchable online. This would suit my purposes, except that my files are not necessarily books, music, or movies, and I want the files themselves accessible online, not just a database describing my files. A commercially available solution like the ones above would be acceptable, but I'd prefer to have the whole setup under my control (i.e. I'd like to either implement it by hand, or use commercial software that doesn't rely on the company's servers, a continued fee, etc.). The current extent of my internet experience is designing a few Google Sites, so there's a fair chance I won't understand the answers I receive, but I'm always happy to have a summer project :)


  • Access a PLESK website before propagation?

    - by RCNeil
    My web host uses Plesk, and I want to know if there is any way to access and view a website (with PHP and other processes functional) before the domain name has propagated. I have found countless forums on this, but they are all pretty old (circa 2001-04) and involve either tricking your localhost or SSH commands, and some even result in terrible security risks. I would like to access a web page directory through a browser and see its contents, with the PHP processes executing, before I propagate its eventual domain name. People claim this is pointless, but during a site migration why on earth would you not test a site before propagating it? I'm looking for something similar to what cPanel offers, i.e. http://IP.ADDRESS./~mydomain.com. The only solution I could think of is storing the site in a new directory of an already functional site, setting up the databases, and testing the site once it's complete. Once tested and working, I should easily be able to migrate the files to the new domain name's root directory, set up the new databases, and then propagate the domain name. I can't believe that Plesk 10+ still does not have a site preview method that covers PHP, JS, and Flash.
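
    For what it's worth, the "tricking your localhost" approach from those old forums is still the standard answer, and it is safer than it sounds: you override DNS only on the workstation you test from, so nothing changes for visitors. A minimal sketch (the IP is a placeholder for the new server's address):

        # Append to /etc/hosts (Linux/Mac) or C:\Windows\System32\drivers\etc\hosts:
        203.0.113.10    mydomain.com www.mydomain.com

    With that line in place, browsing to http://mydomain.com reaches the new Plesk server with the correct Host header, so name-based virtual hosting, PHP, and database connections all behave exactly as they will after propagation; remove the line when testing is done.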


  • View the Time & Date in Chrome When Hiding Your Taskbar

    - by Asian Angel
    Do you prefer keeping your Taskbar hidden but still need to keep watch on what time it is? Now you can keep track of the time without the Taskbar using the Date Today extension for Google Chrome.

    A Look at Date Today with Different Themes

    This extension does one thing and does it well… it provides you with an “active icon” clock that will let you view the time and date in two fashions. The first is by hovering your mouse over the “Toolbar Clock Button”… and the second is by clicking on the “Toolbar Clock Button” to view an enlarged version. Here you can see the extension in use with five different themes to get an idea of how it might look with the theme that you are currently using. It does stand out very nicely with brighter or darker colored themes.

    Conclusion

    While this extension is obviously not for everyone, it will make a nice (and useful) addition to Chrome for those who prefer keeping their Taskbar hidden.

    Links

    Download the Date Today extension (Google Chrome Extensions)


  • Packing up files on my machine, sending it to a server, and unpacking it

    - by MxyL
    I am implementing a feature in my application that sends all files in a specified folder to a server. I have the basic FTP transaction set up using Apache Commons FTPClient: it sets up a connection and transfers a file from one place to another, so I can simply loop over the directory and use this connection to transfer all the files. However, this could be better. Rather than transferring each file one by one, it makes more sense to pack them up in a compressed archive and then send the whole thing at once; this saves time and bandwidth, since these are just text files and they compress nicely. So I would like to add automatic archive packing and unpacking. This is the workflow I have planned out, using zip compression:

    1. Zip all files in the folder
    2. Send the file over
    3. Unzip the files at the destination

    Steps 1 and 2 are easy, since the files are on the local machine, but I'm not sure how to accomplish the last step, when the files are on a remote server. What are my options? I have control over what I can put and run on the server. Perhaps it is not necessary to do the packing/unpacking myself?
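
    Since you control the server, one option (a sketch; paths and schedule are placeholders) is a small cron job that watches the FTP upload directory and unpacks whatever arrives, so the client never has to trigger the unzip step itself:

        #!/bin/sh
        # unpack-drops.sh - run from cron, e.g.: * * * * * /usr/local/bin/unpack-drops.sh
        DROP=/home/ftpuser/uploads     # where the FTP client deposits archives
        DEST=/var/data/incoming        # where unpacked files should land

        for f in "$DROP"/*.zip; do
            [ -e "$f" ] || continue    # glob matched nothing: no archives waiting
            unzip -o "$f" -d "$DEST" && rm -- "$f"
        done

    To avoid unpacking a half-uploaded archive, have the client upload under a temporary name and rename it to *.zip once the transfer completes (FTPClient's rename() does this).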


  • What should be the path for storing Maildir e-mails?

    - by Thufir
    Am I storing e-mails to the correct path? Working from the dovecot-postfix package, I'm able to deliver e-mail to myself like so:

        thufir@dur:~$ telnet localhost 25
        Trying 127.0.0.1...
        Connected to localhost.
        Escape character is '^]'.
        220 dur.bounceme.net ESMTP Postfix (Ubuntu)
        HELO me
        250 dur.bounceme.net
        mail from:<[email protected]>
        250 2.1.0 Ok
        rcpt to:<thufir@localhost>
        250 2.1.5 Ok
        data
        354 End data with <CR><LF>.<CR><LF>
        subject: to evolution mail

        we'll see if this goes through.
        .
        250 2.0.0 Ok: queued as 43D6F2A07C1
        quit
        221 2.0.0 Bye
        Connection closed by foreign host.

    And then here's the message:

        thufir@dur:~$ ll Maildir/new/
        total 20
        drwx------ 2 thufir thufir 4096 Nov 16 18:56 ./
        drwx------ 5 thufir thufir 4096 Nov 16 18:56 ../
        -rw------- 1 thufir thufir  410 Nov 16 11:57 1353095866.M305477P3932.dur,S=410,W=422
        -rw------- 1 thufir thufir  424 Nov 16 17:20 1353115248.M841336P2990.dur,S=424,W=436
        -rw------- 1 thufir thufir  445 Nov 16 18:56 1353121003.M187706P3838.dur,S=445,W=457

        thufir@dur:~$ nl Maildir/new/1353121003.M187706P3838.dur\,S\=445\,W\=457
             1  Return-Path: <[email protected]>
             2  X-Original-To: thufir@localhost
             3  Delivered-To: thufir@localhost
             4  Received: from me (localhost [127.0.0.1])
             5      by dur.bounceme.net (Postfix) with SMTP id 43D6F2A07C1
             6      for <thufir@localhost>; Fri, 16 Nov 2012 18:55:55 -0800 (PST)
             7  subject: to evolution mail
             8  Message-Id: <[email protected]>
             9  Date: Fri, 16 Nov 2012 18:55:55 -0800 (PST)
            10  From: [email protected]
            11  we'll see if this goes through.

    Do I perhaps have Postfix misconfigured? I ask because Evolution seems to use a different path for mail.
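
    The layout shown (messages landing in ~/Maildir/new/) is the standard Maildir structure, so delivery itself looks correct. As a hedged cross-check that Postfix and the IMAP side agree on the location (doveconf is the Dovecot 2.x tool; on 1.x use dovecot -a | grep mail_location):

        postconf home_mailbox      # should print: home_mailbox = Maildir/
        doveconf mail_location     # e.g.: mail_location = maildir:~/Maildir

    If Evolution reads the account over IMAP against this Dovecot, it will see the same messages; pointing it at a local mbox path instead would explain the discrepancy.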


  • Could crosslinking using very general anchor texts be a reason for a drop in rankings?

    - by webmasters
    I have crosslinked 20 sites and I thought I had been penalized for this; I asked this question and some experienced members told me that crosslinking may not necessarily be the reason. The sites are on the same host, on different C-class IPs, and every site is linked to every other. Each site targets long-tail keywords:

        Site 1 - BMW Used Cars - and my area
        Site 2 - WW Used Cars - and my area

    And so on... When I crosslinked them (in the sidebar), I did it for the users; instead of repeating the terms "used cars" and my location over and over (since my users are targeted), I just crosslinked them using the brand: BMW, WW. Targeting locally, my niches are not overly competitive, so I did not need many external links to rank in various positions on the first page. I'm thinking that when I chose to link using only the brand, Google might have thought I wanted to actually rank for BMW and WW, hence the drop in my targeted local traffic. Could this be? I have now no-followed the links and I am noticing a slight recovery, but if it's not an interlinking penalty it would be a shame not to benefit from my links.


  • Scripted SOA Diagnostic Dumps for PS6 (11.1.1.7)

    - by ShawnBailey
    When you upgrade to SOA Suite PS6 (11.1.1.7) you acquire a new set of Diagnostic Dumps in addition to what was available in PS5. With more than a dozen to choose from, and not wanting to run them one at a time, this blog post provides a sample script to collect them all quickly and, hopefully, easily. There are several ways that this collection could be scripted, and this is just one example.

    What is included:
    - wlst.properties: Ant properties
    - build.xml: Ant build file
    - soa_diagnostic_script.py: Python script

    What is collected:
    - 5 contextual thread dumps at 5-second intervals
    - Diagnostic log entries from the server
    - WLS image, which includes the domain configuration and WLS runtime data
    - Most of the SOA Diagnostic Dumps, including those for the BPEL runtime, adapters, and composite information from MDS

    Instructions:
    1. Download the package and extract it to a location of your choosing
    2. Update the properties file 'wlst.properties' to match your environment
    3. Run 'ant' (must be on the path)
    4. Collect the zip package containing the files (by default it will be in the script.output location)

    Properties reference:
    - oracle_common.common.bin: Location of oracle_common/common/bin
    - script.home: Location where you extracted the script and supporting files
    - script.output: Location where you want the collections written
    - username: User name for server connection
    - pwd: Password to connect to the server
    - url: T3 URL for server connection, '<host>:<port>'
    - dump_interval: Interval in seconds between thread dumps
    - log_interval: Duration in minutes that you want to go back for diagnostic log information
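
    As an illustration only (every value is a placeholder for your environment; the property names are exactly those in the reference above), a filled-in wlst.properties might look like:

        oracle_common.common.bin=/u01/app/oracle/middleware/oracle_common/common/bin
        script.home=/home/oracle/soa_diag
        script.output=/tmp/soa_diag_output
        username=weblogic
        pwd=welcome1
        url=soahost1.example.com:8001
        dump_interval=5
        log_interval=60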


  • How do I dissuade users from using the same password with similar systems?

    - by Resorath
    I'm building a web application that connects to other web services (using strictly anonymous binding, so no user passwords are being used). However, the web application maintains its own users, and is required to ask for certain details such as e-mail addresses and public linking information for these other web services (for example, a username but not a password). I want to deter or prevent users from reusing, in my application, passwords that they have also used in the applications I'm linking to. For example, if I ask for their e-mail and they give me their Gmail address, I don't want them using their Gmail password for my system. Another example would be reusing a password from a linked system for which they also gave me their username. One idea I had was to simply take the information they gave me, along with the password they are trying to store, and attempt to log in to these external web applications to test the password - then immediately unbind if I was successful and ask the user to use a different password. However, I suspect there is a host of moral and legal issues there. The reason this is a big deal to me is accountability. My application is simply not funded enough to invest properly in security around user passwords. A salted, hashed password in a public SQL-like database is as secure as it gets. So if passwords and linked usernames or e-mails get out, I don't want my user base compromised.


  • How to switch off? [closed]

    - by Xophmeister
    While I've programmed software for many years, I've only recently started doing so professionally, and have noticed a bit of a problematic pattern. I hope this is the best place to pose such a question, as I am interested in others' experiences and solutions... Writing software is, by its nature, a cerebral exercise. When coding for my own sake, I would do so until I was satisfied, even if that meant going all night. Now that I'm coding in exchange for goods and services, on projects that are inherently uninteresting to me, I want to 'switch off' when it's time to go home. Maybe you consider that a 'bad attitude', but I just don't feel that whatever I'm working on is worth caring about after hours. Besides, my employer doesn't exactly have the infrastructure required to make out-of-office changes; I can't just clone a repo, and even remote login is a PITA. Anyway, the problem I'm experiencing is that, while I'm not particularly overworked or stressed, if I'm faced with a problem, my brain will work on a solution. Generally, it won't give up. Hence I can't switch off and, sometimes, the problem or the solution is significant enough that it disrupts my sleep. While, paradoxically, this doesn't seem to affect my coding ability, it can have a profound impact on the rest of my life: I get increasingly low as I get tired. So far, the best solutions I've found are writing little notes on the matter (and, say, e-mailing them back to my work address) and exercise. Neither of these switches me off entirely and, as the week progresses, exercise especially becomes untenable due to tiredness. TL;DR: How can you stop being a coding zombie?


  • NMap 6.01

    - by TATWORTH
    NMap 6.01 has been released at http://nmap.org/download.html

    "Nmap ("Network Mapper") is a free and open source (license) utility for network discovery and security auditing. Many systems and network administrators also find it useful for tasks such as network inventory, managing service upgrade schedules, and monitoring host or service uptime. Nmap uses raw IP packets in novel ways to determine what hosts are available on the network, what services (application name and version) those hosts are offering, what operating systems (and OS versions) they are running, what type of packet filters/firewalls are in use, and dozens of other characteristics. It was designed to rapidly scan large networks, but works fine against single hosts. Nmap runs on all major computer operating systems, and official binary packages are available for Linux, Windows, and Mac OS X. In addition to the classic command-line Nmap executable, the Nmap suite includes an advanced GUI and results viewer (Zenmap), a flexible data transfer, redirection, and debugging tool (Ncat), a utility for comparing scan results (Ndiff), and a packet generation and response analysis tool (Nping)."

    The home page is at http://nmap.org/. Nmap is free to download and use, and you can download the source and compile it yourself if you so require.
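
    For the curious, a couple of representative invocations (scanme.nmap.org is the project's sanctioned test target; only scan hosts you have permission to probe):

        nmap -sV scanme.nmap.org       # detect services and their versions on open ports
        sudo nmap -O 192.168.1.0/24    # OS fingerprinting across a subnet (requires root)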


  • Sweden Windows Azure Group Meeting in November & Fast with Windows Azure Competition

    - by Alan Smith
    SWAG November Meeting

    There will be a Sweden Windows Azure Group (SWAG) meeting in Stockholm on Monday 19th November. Chris Klug will be presenting a session on Windows Azure Mobile Services, and I will be presenting a session on Web Site Authentication with Social Identity Providers. Active Solution have been kind enough to host the event, and will be providing food and refreshments. The registration link is here: http://swag14.eventbrite.com. If you would like to join SWAG, the link is here: http://swagmembership.eventbrite.com

    Fast with Windows Azure Competition

    I've entered a 3-minute video of rendering a 3D animation using 256 Windows Azure worker roles in the "Fast with Windows Azure" competition. It's the last week of voting, so it would be great if you could check out the video and vote for it if you like it. I have not driven a car for about 15 years, so if I win you can expect a hilarious summary of the track day in Vegas. My preparation for the day would be to play Project Gotham Racing for a weekend and watch a lot of Top Gear. My video is "Rapid Massive On-Demand Scalability Makes Me Fast!". The link is here: http://www.meetwindowsazure.com/fast/


  • Should I sell video tutorials on my own or via publishers like lynda.com? [closed]

    - by Derfder
    I am asking this because I am deciding between two models right now. One way is to create video tutorials on my own (make some short free videos, plus long pay-per-download/stream videos); the other is to sell them to lynda.com or tutsplus. The second way is easier, because they will do all the boring business stuff, host the files for download, etc. In that case, all I need is a good microphone, and to obey their guidelines. On the other side, if I do it on my own, I have to do all the unwanted business stuff, pay for the server, and so on. This is quite a big downside; however, I would keep all the videos under my control in the future. I know that lynda.com gets bigger attention and has more marketing muscle than I am capable of, but if you take e.g. phpvideotutrials.com (r.i.p ;), I think Leigh was very successful with a relatively small budget. The interesting question is the cost, or how much they would pay me: would it be less than selling it myself, after monthly server hosting and other expenses? Any advice from people who actively sell their videos to such companies, or do it on their own, is highly appreciated.


  • Have Windows Automatically Login Without Entering Your Password

    - by deadlydog
    If you are like me and don't want to have to enter your password each time Windows loads, you can have Windows start up without prompting you for a user name or password. The simple (and BAD) way to do this is to simply not have a password on your user account, but that's a big security risk and will allow people to easily remote desktop into your computer. So, first set a password on your Windows account if you don't already have one. Then select Run... from the Start menu (or use Windows Key + R to open the Run window) and type control userpasswords2, which will open the user accounts application. On the Users tab, clear the box for "Users must enter a user name and password to use this computer", and click OK. An "Automatically Log On" dialog box will appear; enter the user name and password for the account you want to use to automatically log into Windows. That's it. You may also want to make sure your screen saver is not set to prompt you for a password when it exits. Now your computer is secure without getting in your way.
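
    For scripted setups, the same behavior can be switched on through the classic Winlogon registry values from an elevated command prompt. A sketch, with obvious placeholders; note that this route stores the password in plain text in the registry, which the control userpasswords2 dialog avoids:

        reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v AutoAdminLogon /t REG_SZ /d 1 /f
        reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultUserName /t REG_SZ /d YourUser /f
        reg add "HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon" /v DefaultPassword /t REG_SZ /d YourPassword /f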


  • Software for a online collaborative bi/tri lingual dictionary [closed]

    - by user537488
    I am looking for software which I can host on popular, general shared web hosting services (online software like WordPress, MediaWiki, Drupal, etc.) and which can do the following:

    - allow users to create accounts
    - allow users or anons to add words to the dictionary (there will be English as the base language, plus other languages)
    - an easy way to import all the words from an English dictionary
    - users should be able to write that language's equivalent of the English word
    - every word should have its own address and page, like www.namesomething.com/word/en/software, which will contain the word "software" and the other language's word for it
    - search should be fast and should find near matches
    - it should be able to list related words: if the user is looking at "software", then other words starting with "s", like "softcopy", should appear alphabetically on that page
    - anyone should be able to comment on a word; comments are not shown on the main page but on a separate page, similar to the talk page in a wiki
    - anyone should be able to contribute
    - a clean interface, unlike a wiki (MediaWiki and all the others), just for the words

    I tried MediaWiki and other wiki software, but it was overloaded and cluttered. I am looking for an interface similar to oed.com, but cleaner and more minimal, as we are not going to have that much information: just words in English and their equivalents in the other language. Here we are talking about a language which is not yet on the Internet. It should be collaborative.


  • Error: kernel headers not found. (But they are in place)

    - by Guandalino
    I'm trying to install the Guest Additions in VirtualBox 4.0.4. The host OS is Ubuntu desktop 11.04 64-bit; the guest OS is Ubuntu server 11.10 64-bit.

        $ sudo ./VBoxLinuxAdditions.run

    After some output this line is printed:

        The headers for the current running kernel were not found.

    But the headers are installed, at least according to dpkg:

        $ dpkg --get-selections | grep linux-headers
        linux-headers-3.0.0-12            install
        linux-headers-3.0.0-12-server     install
        linux-headers-server              install

    The running kernel is:

        $ uname -a
        Linux foobar 3.0.0-12-server #20-Ubuntu SMP Fri Oct 7 16:36:30 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux

    How do I fix things so that the Guest Additions installer is able to find the kernel headers?

    Update: added full output.

        The headers for the current running kernel were not found. If the module compilation fails then this could be the reason.
        Building the main Guest Additions module ...done.
        Building the shared folder support module ...fail!
        (Look at /var/log/vboxadd-install.log to find out what went wrong)
        Installing the Window System drivers ...fail!
        (Could not find the X.Org or XFree86 Window System).

    I don't care about failure #2, because this is a server and I don't need an X server. But I do need shared folder support. Some further detail:

        $ tail /var/log/vboxadd-install.log
        ..........
        cc1: some warnings being treated as errors
        make[2]: *** [/tmp/vbox.0/vfsmod.o] Error 1
        make[1]: *** [_module_/tmp/vbox.0] Error 2
        make: *** [vboxsf] Error 2
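
    The installer locates headers through the kernel's build symlink rather than through dpkg, so a hedged first check inside the guest is:

        # The installer follows this symlink to find the headers:
        ls -l /lib/modules/$(uname -r)/build

        # If it is missing or dangling, reinstall the matching headers and retry:
        sudo apt-get install --reinstall linux-headers-$(uname -r)
        sudo ./VBoxLinuxAdditions.run

    Note that the log excerpt actually shows a compile error rather than missing headers, which may simply mean the 4.0.4-era vboxsf sources do not build against a 3.0 kernel; upgrading VirtualBox (and thus the Guest Additions) to a newer 4.x release may be the real fix.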


  • Wired Connection Problem

    - by Dave
    After upgrading to 12.04 my internet connection no longer works. More precisely, it is really, really slow; occasionally it will connect, but only for a few moments, and then it disappears again. I am on a Lenovo Workstation E20. Output of ifconfig:

        eth0      Link encap:Ethernet  HWaddr 70:f3:95:00:64:3e
                  inet addr:192.168.1.20  Bcast:192.168.1.255  Mask:255.255.255.0
                  inet6 addr: fe80::72f3:95ff:fe00:643e/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:7398 errors:0 dropped:74 overruns:0 frame:0
                  TX packets:6684 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:5407828 (5.4 MB)  TX bytes:854343 (854.3 KB)
                  Interrupt:20 Memory:fb120000-fb140000

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:16436  Metric:1
                  RX packets:1587 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:1587 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:0
                  RX bytes:152089 (152.0 KB)  TX bytes:152089 (152.0 KB)

    I am really at a loss for what to do. I am relatively new to Ubuntu; I searched the other user questions and couldn't figure this out.
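
    The 74 dropped RX packets in that output hint at a driver or negotiation problem, so a few hedged first checks (the interface name is taken from the output above; the gateway address is a guess based on the subnet):

        sudo ethtool eth0        # is the link negotiated at the expected speed/duplex?
        ping -c 4 192.168.1.1    # loss/latency to the (assumed) gateway
        dmesg | grep -i eth0     # kernel/driver messages for the NIC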


  • send music to upnp device from PC

    - by markrich
    I have a new Arcam airDAC (http://www.arcam.co.uk/products,rSer...ACs,airDAC.htm) attached to my stereo, which has UPnP support. I would like to send audio from my 14.04 PC to the box, and to this end I installed Rygel on my system to help, but it hasn't. I have created a new sound device in PulseAudio Preferences and selected it from indicator-sound-switcher, but here I become stuck. The sound is heading to the new sound device, as the volume can be seen to go up and down in PulseAudio Volume Control, but no sound comes from the stereo downstairs. The problem, as I see it, is that the new device has no idea where to send the music, as the Arcam hasn't been chosen from any program. So I installed BubbleUPnP and Plex. My music has been imported into the latter, and the former can see both the Arcam as a renderer and Plex as the media server. Installing the BubbleUPnP program on my Android tablet allowed me to send music, and all seemed good UNTIL I started playing AIFF and ALAC music, and it all stopped: no suitable decoding device. So that scuppered that route. So here I am, stuck. How can I tell Ubuntu to use the Arcam as a renderer to play music through when albums are played from Rhythmbox, Tomahawk, Clementine or another player? Clementine would be my preferred client, as there is a usable remote control program for the tablet. Can anyone help me fix this, or advise another way to do what I would like?


  • CSOM (Client Side Object Model) - What's new with SharePoint 2013

    - by KunaalKapoor
    SharePoint CSOM

    The Client-Side Object Model, or CSOM, came out with SharePoint 2010. CSOM is accessible through client.svc, but all client.svc calls must go through supported WCF entry points (the supported entry points are .NET, Silverlight and JavaScript). So a developer needs to use the client-side proxy objects exposed by either a .NET assembly or a JavaScript library.

    Changes with SharePoint 2013

    - REST capabilities: direct access to client.svc
    - New APIs: app model

    REST Capabilities

    One of the most important changes to the CSOM in SharePoint 2013 is that the web service entry point of client.svc has been extended to allow direct access via REST-based web service calls. This is a really critical change, since it makes the SharePoint platform accessible to any other platform, opening the horizons of integration and collaboration with other REST-based platforms and devices. OData (a really popular standard data-access API for HTTP-based clients) is supported, as in 2010, but will be a more important aspect of SharePoint 2013 development.

    New APIs

    CSOM for SharePoint 2013 has been buffed up with several new APIs, not only for SharePoint server functionality but also an API for Windows Phone applications. In a SharePoint 2010 farm, most of the new APIs mentioned below are available only via server-side APIs:

    - Search
    - Taxonomy
    - Publishing
    - Workflow
    - User Profiles
    - E-Discovery
    - Analytics
    - Business Data
    - IRM
    - Feeds

    SharePoint 2013 remote APIs being accessible through both CSOM and REST is very important to the new app model, where developers can no longer run code in a SharePoint environment nor access the server-side APIs. So CSOM plays the savior here. Also, you can now substitute the alias '_api' in order to reference '_vti_bin/client.svc'.
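
    As a small illustration of that '_api' alias (host name and credentials are placeholders, and the authentication mechanism varies by farm; on-premises NTLM is assumed here), the two requests below are equivalent:

        curl --ntlm -u 'DOMAIN\user:pass' -H 'Accept: application/json;odata=verbose' \
             'https://sharepoint.example.com/_api/web/title'
        curl --ntlm -u 'DOMAIN\user:pass' -H 'Accept: application/json;odata=verbose' \
             'https://sharepoint.example.com/_vti_bin/client.svc/web/title'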


  • Hosting and scaling a Facebook application in the cloud? [migrated]

    - by DhruvPathak
    We will be building a Facebook application in Django (Python), but we are still not sure where to host it economically, with good provision to scale in case the app goes viral. Some details about the app:

    - It would be HTML-based, like a website, using Django as the framework.
    - 100K pageviews a day is the expectation if the app goes viral.
    - The users will not generate any media content; only some database data will be generated by them.

    It would be great if someone with more experience could give guidance on the following points:

    A) Hosting on Google App Engine, Amazon EC2, or some other cloud like Rackspace. The preferable points we found in App Engine were ease of deployment, cost effectiveness and easy scaling; for EC2, full control of the virtual machine, plus Amazon's NoSQL and RDBMS database services in case we decide to use them.

    B) Does the backend technology affect monthly cost? E.g., would the CPU and memory usage difference of Django over, for example, a PHP framework like CodeIgniter really make a remarkable difference in running costs? (Here is the article that triggered this thought process: http://journal.dedasys.com/2010/01/12/rough-estimates-of-the-dollar-cost-of-scaling-web-platforms-part-i#comments)

    C) Does something like Heroku, which provides additional services on top of Amazon EC2, prove to be better than raw cloud management?

    It is not that we are trying for premature scaling; we just want a good start so that we are ready to handle unpredicted growth and scale.


  • Minimum percentage of free physical memory that Linux require for optimal performance

    - by csoto
    Recently, we have been getting questions about the percentage of free physical memory that the OS requires for optimal performance, mainly applicable to physical compute nodes. Under normal conditions you may see that, on nodes without any applications running, the OS takes (for example) between 24 and 25 GB of memory. The Linux system reports free memory in a different way, and most of those 25 GB (in the example) are available for user processes, e.g.:

        Mem:  99191652k total, 23785732k used, 75405920k free, 173320k buffers

    MOS Doc ID 233753.1 - "Analyzing Data Provided by '/proc/meminfo'" - explains it (section 4, "Final Remarks"):

        Free Memory and Used Memory

        Estimating the resource usage, especially the memory consumption of processes, is by far more complicated than it looks at first glance. The philosophy is: an unused resource is a wasted resource. The kernel therefore will use as much RAM as it can to cache information from your local and remote filesystems/disks. This builds up over time, as reads and writes are done on the system, trying to keep the data stored in RAM as relevant as possible to the processes that have been running on your system. If there is free RAM available, more caching will be performed and thus more memory 'consumed'. However, this doesn't really count as resource usage, since this cached memory is available in case some other process needs it. The cache is reclaimed, not at the time of process exit (you might start up another process soon that needs the same data), but upon demand.

    That said, focusing more specifically on the percentage question: apart from this memory that the OS takes, what is the minimum free memory that must be available on every node so that it operates normally? The answer: as a rule of thumb, 80% memory utilization is a good threshold; anything bigger than that should be investigated and remedied.
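
    A quick way to compute utilization in the cache-aware sense described above (a sketch; the field names are the standard /proc/meminfo ones):

        # Percentage of RAM in use once reclaimable buffers/cache are discounted:
        awk '/^MemTotal/{t=$2} /^MemFree/{f=$2} /^Buffers/{b=$2} /^Cached/{c=$2}
             END{printf "utilization: %.1f%%\n", (t-f-b-c)*100/t}' /proc/meminfo

    It is this figure, not the raw "used" column from free or top, that should be compared against the 80% rule of thumb.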


  • VirtualBox 4.0.10 is now available for download

    - by user12611829
    VirtualBox 4.0.10 has been released and is now available for download. You can get binaries for Windows, OS X (Intel Mac), Linux and Solaris hosts at http://www.virtualbox.org/wiki/Downloads

    The full changelog can be found here. The high points for the 4.0.10 maintenance release include:

    - GUI: fixed disappearing settings widgets on KDE hosts (bug #6809)
    - Storage: fixed hang under rare circumstances with flat VMDK images
    - Storage: a saved VM could not be restored under certain circumstances after the host kernel was updated
    - Storage: refuse to create a medium with an invalid variant
    - Snapshots: none of the hard disk attachments must be attached to another VM in normal mode when creating a snapshot
    - USB: fixed occasional VM hangs with SMP guests
    - USB: proper device detection on RHEL/OEL/CentOS 5 guests
    - ACPI: force the ACPI timer to return monotonic values for improved behavior with SMP Linux guests
    - RDP: fixed screen corruption under rare circumstances
    - rdesktop-vrdp: updated to version 1.7.0
    - OVF: under rare circumstances some data at the end of a VMDK file was not written during export
    - Mac OS X hosts: Lion fixes
    - Mac OS X hosts: GNOME 3 fix
    - Linux hosts: fixed VT-x detection on Linux 3.0 hosts
    - Linux hosts: fixed Python 2.7 bindings in the universal Linux binaries
    - Windows hosts: fixed leak of thread and process handles
    - Windows Additions: fixed bug when determining the extended version of the Guest Additions
    - Solaris Additions: fixed installation to 64-bit Solaris 10u9 guests
    - Linux Additions: RHEL6.1/OL6.1 compile fix
    - Linux Additions: fixed a memory leak during VBoxManage guestcontrol execute

