Search Results

Search found 15665 results on 627 pages for 'inno setup'.

Page 231 of 627

  • Installing drivers for an ATI 6950

    - by aj.esler
    Is there any way to get Ubuntu to recognise an ATI 6950? I have tried installing the proprietary ATI drivers, but these do not seem to work properly: they break the GNOME desktop and the machine starts booting into a terminal. I'm somewhat of a Linux newbie, so I'm not really sure what's going wrong. I don't want to game on Ubuntu, I just want it to recognise my dual-monitor setup rather than mirroring the monitors, so any driver that does this is fine by me. It seems like the open-source ATI radeon drivers will not support the 6950/6970 for some time, at least until kernel 2.6.38 is released. The card is also flashed with the 6970 firmware, not sure if this is important. I am running a 64-bit version of Ubuntu; I installed Ubuntu Server 10.10, then installed gnome-desktop.
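
    If the open-source radeon driver does come up, extending the desktop across both monitors is usually a single xrandr call. A minimal sketch is below; the output names (DVI-0, DVI-1) are placeholders, so check the output of xrandr -q for the real names on this card.

        # List the detected outputs and their modes first.
        xrandr -q
        # Extend the desktop instead of mirroring; output names are placeholders.
        xrandr --output DVI-0 --auto --output DVI-1 --auto --right-of DVI-0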

    Read the article

  • Can I use Ubuntu as a wireless media server which performs all decoding/processing server-side?

    - by AthloX
    I want to set up Ubuntu 12.04 Desktop as a home media server. I have a Windows 7 netbook, an Ubuntu 12.04 LTS laptop, and a Samsung Galaxy Note tablet (Android), plus two desktops in another room dual-booting Windows 7 and Ubuntu, and a Sharp AQUOS plasma TV connected over Wi-Fi. I want to install Ubuntu as a media server to stream audio/video files over Wi-Fi. More than that, I want the media server to use its own processing power to decode and stream, so that the remote end only plays the file without using its own resources. Is it possible to set Ubuntu up as a media server that streams files without making the remote end do any of the work? I want only the Wi-Fi bandwidth and the media server's hardware to be used; the remote gadgets should only need their screen and speakers, not their own processing power. Any suggestions on whether this is possible?

    Read the article

  • JPRT: A Build & Test System

    - by kto
    DRAFT A while back I did a little blogging on a system called JPRT, the hardware used, and a summary on my java.net weblog. This is an update on the JPRT system. JPRT ("JDK Putback Reliability Testing", but ignore what the letters stand for, I change what they mean every day, just to annoy people :^) is a build and test system for the JDK, or any source base that has been configured for JPRT. As I mentioned in the above blog, JPRT is a major modification to a system called PRT that the HotSpot VM development team has been using for many years, very successfully I might add. Keeping the source base always buildable and reliable is the first step in the 12 steps of dealing with your product quality... or was that the 12 steps from Alcoholics Anonymous... oh well, anyway, it's the first of many steps. ;^)

    Internally, when we make changes to any part of the JDK, there are certain procedures we are required to perform prior to any putback or commit of the changes. The procedures often vary from team to team, depending on many factors, such as whether native code is changed, or whether the change could impact other areas of the JDK. But a common requirement is a verification that the source base with the changes (merged with the very latest source base) will build on many if not all of the 8 platforms, with a full 'from scratch' build rather than an incremental build, which can hide full-build problems. The testing needed varies, depending on what has been changed.

    Anyone that has worked on a project where multiple engineers or groups are submitting changes to a shared source base knows how disruptive a 'bad commit' can be for everyone. How many times have you heard: "So And So made a bunch of changes and now I can't build!"? Now multiply that across 8 platforms, make all the platforms old and antiquated OS versions with bizarre system setup requirements, and you have a pretty complicated situation (see http://download.java.net/jdk6/docs/build/README-builds.html).

    We don't tolerate bad commits, but our enforcement is somewhat lacking; usually it's an 'after the fact' correction. Luckily the source code management system we use (another antique called TeamWare) allows for a tree of repositories, so 'bad commits' are usually isolated to a small team. Punishment to date has been pretty drastic: the Queen of Hearts in 'Alice in Wonderland' said 'Off With Their Heads', and trust me, you don't want to be the engineer doing a 'bad commit' to the JDK. With JPRT, hopefully this will become a thing of the past. Not that we have had many 'bad commits' to the master source base; in general the teams doing the integrations know how important their jobs are and they rarely make 'bad commits'. So for these JDK integrators, maybe what JPRT does is keep them from chewing their fingernails at night. ;^)

    Over the years each of the teams has accumulated sets of machines they use for building, or they use some of the shared machines available to all of us. But the hunt for build machines is just part of the job, or has been. And although consistency of the build machines hasn't been a horrible problem, you often never know if the Solaris build machine you are using has all the right patches, or if the Linux machine has the right service pack, or if the Windows machine has its latest updates. Hopefully the JPRT system can solve this problem. When we ship the binary JDK bits, it is SO very important that the build machines are correct, and we know how difficult it is to get them set up. Sure, if you need to debug a JDK problem that only shows up on Windows XP or Solaris 9, you'll still need to hunt down a machine, but not as a regular everyday occurrence.

    I'm a big fan of a regular nightly build and test system, constantly verifying that a source base builds and tests out. There are many examples of automated build/test systems: some trigger on any change to the source base, some just run every night. Some provide a protection gateway to the 'golden' source base, which only gets changes that the nightly process has verified are good. The JPRT (and PRT) system is meant to guard the source base before anything is sent to it, guarding all source bases from the evil developer - well, maybe 'evil' isn't the right word; I haven't met many 'evil' developers, more like 'error prone' developers. ;^) Humm, come to think about it, I may be one from time to time. :^{

    But the point is that by spreading the build over a set of machines, and getting the turnaround down to under an hour, it becomes realistic to completely build on all platforms and test it, on every putback. We have the technology, we can build and rebuild and rebuild, and it will be better than it was before, ha ha... Anybody remember the Six Million Dollar Man? Man, I gotta get out more often... Anyway, now the nightly build and test can become a 'fetch the latest JPRT build bits' and start extensive testing (the testing not done by JPRT, or the platforms not tested by JPRT). Is it open source? No, not yet. Would you like it to be? Let me know. Or is it more important that you have the ability to use such a system for JDK changes? So enough blabbering on about this JPRT system - tell me what you think. And let me know if you want to hear more about it or not. Stay tuned for the next episode, same Bloody Bat time, same Bloody Bat channel. ;^) -kto

    Read the article

  • Unity and games don't work on new Thinkpad T420

    - by Clay Smalley
    Here's my setup: a brand-new Lenovo ThinkPad T420 with an NVIDIA graphics card, 4GB of RAM, a 128GB solid-state drive, and an Intel Core i5 processor. Given these specs, there's no reason games and Unity shouldn't be working. The strange thing is that both do work when I run from a live USB, but not when Ubuntu is installed to the hard drive. Is there something different about the 3D capabilities when running from the installed system as opposed to running from the live USB? Edit - some more information: when I log in for the first time on the hard-drive install, Ubuntu says "It seems that you do not have the hardware required to run Unity. Please choose Ubuntu Classic at the login screen and you will be using the traditional environment."
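
    A quick way to compare the two environments is to check what the Unity support test and the kernel report in each case. The sketch below assumes the standard diagnostic tool shipped with Unity is present at its usual path; adjust if your release puts it elsewhere.

        # Does the installed system think 3D acceleration is available for Unity?
        /usr/lib/nux/unity_support_test -p
        # Which graphics driver is actually bound to the card? Compare this output
        # between the live USB session and the installed system.
        lspci -nnk | grep -A3 VGA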

    Read the article

  • Cloud to On-Premise Connectivity Patterns

    - by Rajesh Raheja
    Do you have a requirement to convert an Opportunity in Salesforce.com to an Order/Quote in Oracle E-Business Suite? Or maybe you want the creation of an Oracle RightNow Incident to trigger an on-premise Oracle E-Business Suite Service Request creation for RMA and Field Scheduling? If so, read on. In a previous blog post, I discussed integrating TO cloud applications; the use cases above are the reverse, i.e. receiving data FROM cloud applications (SaaS) TO on-premise applications/databases that sit behind a firewall. Oracle SOA Suite is assumed to be on-premise, with Oracle Service Bus as the mediation and virtualization layer. The main considerations for the patterns are security (shielding enterprise resources) and scalability (minimizing firewall latency). Let me use an analogy to help visualize the patterns: the on-premise system is your home - with your most valuable possessions - and the SaaS app is your favorite online store, which regularly ships (inbound calls) various types of parcels/items (message types/service operations). You need the items at home (on-premise) but want to safeguard against misguided elements of society (internet threats) who may masquerade as postal workers and vandalize property (denial of service?). Let's look at the patterns.

    Pattern: Pull from Cloud. The on-premise system polls the SaaS apps and picks up the messages instead of having them delivered. This may be done using Oracle RightNow Object Query Language or SOAP APIs. It is particularly suited to integration approaches where messages trickle in and can be centralized and batched, e.g. retrieving event notifications on an hourly schedule from the Oracle Messaging Service. In the home analogy, you avoid any deliveries to your home and instead go to the post office/UPS/FedEx store to pick up your parcel. Every time.
    Pros: on-premise assets are not exposed to the Internet; firewall issues are avoided by only initiating outbound connections.
    Cons: polling mechanisms may affect performance and may not satisfy near real-time requirements.

    Pattern: Open Firewall Ports. The on-premise system exposes the web services that need to be invoked by the cloud application. This requires opening up firewall ports and routing calls to the appropriate internal services behind the firewall. Fusion Applications uses this pattern, and auto-provisions the services on the various virtual hosts to secure the topology. This works well for service integration, but may not suffice for large-volume data integration. In the home analogy, you have now decided to receive parcels instead of going to the post office every time. A mail slot cut into the door lets the postman drop small parcels, but there is still concern about cutting new holes for larger packages.
    Pros: optimal pattern for near real-time needs; simpler administration once the service is provisioned.
    Cons: needs firewall ports to be opened up for new services; may not suffice for batch integration requiring direct database access.

    Pattern: Virtual Private Networking. The on-premise network is "extended" to the cloud (or an intermediary on-demand / managed service offering) using Virtual Private Networking (VPN) so that messages are delivered to the on-premise system over a trusted channel. In the home analogy, you entrust a set of keys to a neighbor or property manager who receives the packages and then drops them inside your home.
    Pros: individual firewall ports don't need to be opened; more suited to high scalability needs; can support large-volume data integration; easier management of one connection vs. a multitude of open ports.
    Cons: VPN setup; specific hardware support; requires the cloud provider to support virtual private computing.

    Pattern: Reverse Proxy / API Gateway. The on-premise system uses a reverse proxy or "API gateway" on the DMZ to receive messages. The reverse proxy can be implemented using various mechanisms, e.g. Oracle API Gateway provides firewall and proxy services along with comprehensive security, auditing and throttling benefits. If a firewall already exists, then Oracle Service Bus or Oracle HTTP Server virtual hosts can provide reverse proxy implementations on the DMZ. Custom-built implementations are also possible if specific functionality (such as message store-and-forward) is needed. In the home analogy, this pattern sits in between cutting mail slots and handing over keys: you install (and maintain) a mailbox on your premises outside your door. The post office delivers the parcels into your mailbox, from where you can securely retrieve them.
    Pros: very secure; very flexible.
    Cons: introduces a new software component; needs DMZ deployment and management.

    Pattern: On-Premise Agent (Tunneling). A lightweight "agent" sits behind the firewall and initiates the communication with the cloud, thereby avoiding firewall issues. It then maintains a bi-directional connection with either pull- or push-based approaches, using (or abusing, depending on your viewpoint) the HTTP protocol. Programming protocols such as Comet, WebSockets, HTTP CONNECT, HTTP SSH tunneling etc. are possible implementation options. In the home analogy, a resident receives the parcel from the postal worker by opening the door, but you still take precautions with chain locks and package inspections.
    Pros: lightweight software; IT doesn't need to set up anything.
    Cons: may bypass critical firewall checks, e.g. virus scans; separate software download; proliferation of non-IT-managed software.

    Conclusion. The patterns above are some of the most commonly encountered ones for cloud to on-premise integration. Selecting the right pattern for your project involves looking at your scalability needs, security restrictions, synchronous vs. asynchronous implementation, near real-time vs. batch expectations, cloud provider capabilities, budget, and more. In some cases the basic "Pull from Cloud" may be acceptable, whereas in others an extensive VPN topology may be well justified. For more details on the Oracle cloud integration strategy, download this white paper.
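
    As a rough illustration of the "Pull from Cloud" pattern, the sketch below shows an on-premise polling loop in Python. The endpoint URL, token, and handler are hypothetical placeholders, not part of the post; the point is simply that every connection is initiated outbound, so no inbound firewall port is needed.

        # Sketch of the "Pull from Cloud" pattern: the on-premise side initiates
        # every connection. URL, token and handler are illustrative placeholders.
        import json
        import time
        import urllib.request

        POLL_URL = "https://saas.example.com/api/events"    # placeholder endpoint
        POLL_INTERVAL_SECONDS = 3600                        # e.g. hourly batch pickup

        def fetch_pending_events():
            req = urllib.request.Request(POLL_URL, headers={"Authorization": "Bearer <token>"})
            with urllib.request.urlopen(req) as resp:
                return json.loads(resp.read().decode("utf-8"))

        def handle(event):
            # Hand the message to the on-premise mediation layer (e.g. a local queue).
            print("processing", event)

        while True:
            for event in fetch_pending_events():
                handle(event)
            time.sleep(POLL_INTERVAL_SECONDS)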

    Read the article

  • My First robots.txt

    - by Whitechapel
    I'm creating my first robots.txt and wanted to get a second opinion on it. Basically I have an FTP setup on my board for some special users to transfer files between each other, and I do NOT want that included in searches by the bots. I also want to point to my sitemap, which gets auto-generated by a PHP page (it links to xmlsitemap.php because that script generates the sitemap when called). My goal is to allow any search bot to crawl the forums and grab the metadata. Here is what I have - what else should I include, and do I need to fix anything with it?

        User-agent: *
        Disallow: /admin/
        Disallow: /ali/
        Disallow: /benny/
        Disallow: /cgi-bin/
        Disallow: /ders/
        Disallow: /empire/
        Disallow: /komodo_117/
        Disallow: /xanxan/
        Disallow: /zeroordie/
        Disallow: /tmp/
        Sitemap: http://www.vivalanation.com/forums/xmlsitemap.php

    Edit: I'm not sure how to handle all the users' folders under /public_html/, since the robots.txt will be going in /public_html.

    Read the article

  • Computer Lab School for Orphans

    - by Brendon
    I am helping an NGO called Orphans Found Fund, here in Arusha, Tanzania, set up a computer lab to teach students about Ubuntu and open-source applications. I have installed Ubuntu 10.10 on all the systems. What I'm wondering about is how to tweak the systems so that the kids cannot: delete or alter system files; alter the system settings; add or remove applications; or exceed a time limit (like an Internet cafe). Also, as the administrator I would like to monitor usage from another system, to make sure the network is not being abused. Any advice is much appreciated. Brendon

    Read the article

  • Issue with setting up multiple IP addresses on ubuntu server installation

    - by varunyellina
    I want to set up two IP addresses on my system for access over the LAN. For reference, my desktop installation runs fine with multiple IPs added through NetworkManager, on both LAN and Wi-Fi. On my server install I've edited /etc/network/interfaces to the following:

        auto eth0
        auto eth0:1

        # IP-1
        iface eth0 inet static
            address 172.16.35.35
            network 172.16.34.1
            netmask 255.255.254.0
            broadcast 172.166.35.255
            dns-nameservers 172.16.100.221 8.8.8.8

        # IP-2
        iface eth0:1 inet static
            address 172.16.34.34
            network 172.16.34.1
            netmask 255.255.254.0
            gateway 172.16.34.1
            broadcast 172.16.35.255

    After restarting with "/etc/init.d/networking restart" I receive "Failed to bring up eth0:1". What am I doing wrong? Thank you.
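
    For comparison, a minimal sketch of a primary-plus-alias configuration under ifupdown is below. The addresses are illustrative only; the notable differences from the config above are that only the physical interface carries the gateway, and the broadcast address matches the subnet.

        # Sketch only: one physical interface plus one alias; addresses illustrative.
        auto eth0
        iface eth0 inet static
            address 172.16.35.35
            netmask 255.255.254.0
            gateway 172.16.34.1
            broadcast 172.16.35.255
            dns-nameservers 172.16.100.221 8.8.8.8

        auto eth0:1
        iface eth0:1 inet static
            address 172.16.34.34
            netmask 255.255.254.0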

    Read the article

  • How can the maximum number of simultaneous users to log in to Ubuntu server be increased?

    - by nixnotwin
    I use Ubuntu Server 10.04 on a fairly good machine, with a 2.40 GHz dual-core processor and 2GB of RAM. My users log in with SSH or Samba. I have set up LDAP with PAM to sync user accounts between Unix and Samba. When I allowed about 90 users to log in over SSH at once, the server refused logins for many of them. I am using dropbear as the SSH server. Even Samba logins failed for many users. I need to allow at least 100 users to log in at once. Is there any way to do this?

    Read the article

  • The Underlying Value of Aspect-Oriented Programming

    - by Brian
    Hello, I recently got into PostSharp, an AOP tool for weaving in code. I've been finding a lot of resistance from other developers over giving up writing the code that the weaving was meant to simplify. For instance, I'm finding logging or error-handling code in places where I already have PostSharp doing that. I can understand why it's happening, since it's hard to remember everything that the weaving was set up to do (I'm applying a global attribute definition). With that said, factoring in levels of experience, etc., is AOP beneficial to a project? What is your opinion? Thanks.
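
    The cross-cutting idea is easier to see in a small example. PostSharp weaves aspects into .NET assemblies at build time; as a rough analogy only (not PostSharp itself), a Python decorator defined in one place removes exactly the kind of hand-written logging the question describes.

        # Rough analogy to an AOP logging aspect: the cross-cutting concern lives
        # in one place instead of being repeated inside every method body.
        import functools
        import logging

        logging.basicConfig(level=logging.INFO)

        def logged(func):
            @functools.wraps(func)
            def wrapper(*args, **kwargs):
                logging.info("entering %s", func.__name__)
                try:
                    return func(*args, **kwargs)
                except Exception:
                    logging.exception("error in %s", func.__name__)
                    raise
            return wrapper

        @logged
        def transfer_funds(amount):   # hypothetical business method
            return amount * 0.99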

    Read the article

  • Is it safe to have no TOS or PP?

    - by JamerTheProgrammer
    I have coded my own forums from the ground up. I have tried my best to make my code as secure as possible, encrypting everything I can. I want to use this forum for a Minecraft server. I have one concern, however: I would like to set the forum up now, but having no TOS or Privacy Policy has put me off. Will having neither cause me any legal trouble in the unlikely event of a data leak? Thanks

    Read the article

  • SQL Server Express Profiler

    - by David Turner
    During a recent project, while waiting for our development database to be provisioned on the client's corporate SQL Server environment (these things can sometimes take weeks or months to be set up), we began our initial development against a local instance of SQL Server Express, just as an interim measure until the development database was live. This was going just fine, until we found that we needed to do some profiling to understand a problem we were having with the performance of our ORM-generated data access layer. The full version of SQL Server Management Studio includes a profiler that we could use to help with this kind of problem, but the Express version does not, so I was really pleased to find that there is a freely available profiler for SQL Server Express, imaginatively titled 'SQL Server Express Profiler', and it worked great for us. http://sites.google.com/site/sqlprofiler/

    Read the article

  • Cannot boot Win7 after updating to ubuntu 11.10

    - by angryInsomniac
    I had a dual-boot setup on my system, with Ubuntu 11.04 and Windows 7. Yesterday I updated Ubuntu to 11.10. Now I cannot seem to load Windows 7, although I do see an entry for it in the boot menu. I select that entry and then the screen goes black, with everything coming to a standstill. I waited for it to respond, but nothing happened. I went through the forums and tried "grub-update", but when I do that nothing really happens - I'm not sure of this, but I get no output from GRUB for up to 10 minutes (can grub-update be made verbose?). How can I fix this and get back to booting both OSes properly?
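
    For what it's worth, the stock command on Ubuntu is update-grub rather than grub-update. A sketch of re-detecting Windows and rebuilding the GRUB 2 menu (assuming the standard GRUB 2 setup that 11.10 installs) would be:

        # Re-scan the disks for other operating systems (should list the Windows 7 partition).
        sudo os-prober
        # Rebuild /boot/grub/grub.cfg, which regenerates the Windows chainloader entry.
        sudo update-grub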

    Read the article

  • Which programming language suits a system that must work without user input

    - by Ruud
    I'm building a prototype of a device that will function much like a digital photo frame. It will display images retrieved from the internet. The device must start up and run the photo frame software with no user interface. It has a minimal Ubuntu installation, but I could install Xorg or whatever is needed. Question: I'm having trouble figuring out which programming language will be suitable. I've just started using Python to try out several things, and I am able to download and display images. I guess that means Python can do what I'd like, but is it suitable as a language for something that runs on boot without any user interaction? Related questions: how do I set up Linux to start that script automatically? And how do I set up a second Python script as a server that runs in the background to retrieve images before they are displayed (because I think I'll need threading of some sort)?
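
    On the threading question, a minimal Python sketch of a background prefetcher is below. The URL list and the display call are placeholders; the idea is simply that one thread fills a queue ahead of time while the main loop displays whatever is ready.

        # Sketch: a background thread downloads images ahead of time while the
        # main loop displays them. URLs and the display step are placeholders.
        import queue
        import threading
        import urllib.request

        IMAGE_URLS = ["https://example.com/photo1.jpg", "https://example.com/photo2.jpg"]
        prefetched = queue.Queue(maxsize=10)

        def prefetch_worker():
            for url in IMAGE_URLS:
                data = urllib.request.urlopen(url).read()
                prefetched.put((url, data))          # blocks when the buffer is full

        threading.Thread(target=prefetch_worker, daemon=True).start()

        for _ in IMAGE_URLS:
            url, data = prefetched.get()             # waits until an image is ready
            print("displaying", url, len(data), "bytes")   # replace with real display code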

    Read the article

  • Find Nearest Object

    - by ultifinitus
    I have a fairly sizable game engine built, and I'm adding some needed features, such as this: how do I find the nearest object from a list of points? In this case I could simply use the Pythagorean theorem to find the distance and check the results. I know I can't simply add x and y, because that's only the distance to the object if you took right-angle turns. However, I'm wondering if there's something else I could do. I also have a collision system where, essentially, I turn objects into smaller objects on a smaller grid, kind of like a minimap, and I only check for collisions when objects exist in the same grid space. I could do the same thing here, only making the grid space larger, to check for closeness (rather than checking every. single. object), but that would take additional setup in my base class and clutter up the already-cluttered object. TL;DR question: is there something efficient and accurate that I can use to detect which object is closest, based on a list of points and sizes?
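
    One common shortcut, sketched below on the assumption that a plain scan over the list is acceptable: compare squared distances, so no square root is needed. Since the square root is monotonic, the ranking of "closest" is unchanged.

        # Sketch: nearest object by squared distance (no sqrt needed, the ordering
        # of distances is the same either way).
        def nearest(origin, objects):
            ox, oy = origin
            best, best_d2 = None, float("inf")
            for obj in objects:                       # obj assumed to be ((x, y), size)
                (x, y), _size = obj
                d2 = (x - ox) ** 2 + (y - oy) ** 2
                if d2 < best_d2:
                    best, best_d2 = obj, d2
            return best

        print(nearest((0, 0), [((3, 4), 1), ((1, 1), 2), ((10, 0), 5)]))   # -> ((1, 1), 2)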

    Read the article

  • How much to charge for Wordpress installation?

    - by Jack Duluoz
    I know this isn't strictly a technical question, but I hope it's OK here. The question is simple: how much should I charge a customer for a WordPress installation and configuration? Configuration simply means I have to install a theme for him (which is not provided by me) and various plugins, and maybe edit some lines of code here and there to make the whole thing work fine. MORE INFO: I don't do this for a living; I'm just doing this for this single customer. He told me he wants to customize some features of the blog, which I think will require a bit of code editing, but these will be small modifications, because I already told him that more substantial modifications will be billed separately. I don't know exactly how long this will take, but probably just one day for the setup and some more days to adapt the blog to the customer's requests, which will eventually come up later.

    Read the article

  • Both Webmin & ISPConfig won't work on my Ubuntu Linode

    - by SERVE-U
    I followed the steps on this page to download and set up Webmin on my Linode Ubuntu server; however, when I try to visit https + my.ip.add.ress + :10000, the page just hangs and nothing loads. I already looked into my firewall settings. I uninstalled Webmin, installed ISPConfig and all its dependencies, and the same thing happens at https + my.ip.add.ress + :8080. This is my first time managing a server, so there could be something I overlooked, but my server is a pretty vanilla Ubuntu 12.10 with the LAMP stack installed exactly as per the instructions in Linode's documentation.
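
    A few generic checks might narrow down whether the panel is listening at all or the traffic is being filtered somewhere. This is only a sketch; the port depends on whether Webmin (10000) or ISPConfig (8080) is currently installed.

        # Is anything listening on the panel port?
        sudo netstat -tlnp | grep ':10000'
        # Is a local firewall dropping the traffic?
        sudo ufw status verbose
        sudo iptables -L -n
        # Can the panel be reached from the server itself?
        curl -vk https://localhost:10000/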

    Read the article

  • XEN 4.1 missing from the Grub Menu

    - by Sid
    I installed Xen with the following commands:

        apt-get install xen-hypervisor-4.1-i386
        apt-get install xen-utils-4.1
        apt-get install xenwatch
        apt-get install xen-tools
        apt-get install xen-utils-common
        apt-get install xenstore-utils
        apt-get install virtinst
        apt-get install virt-viewer
        apt-get install virt-manager

    as given in http://www.beyondlinux.com/2011/11/02/install-xen-4-1-and-setup-your-cloud-os-on-ubuntu-11-10/. But as soon as I install the packages listed above and reboot, I cannot see any entry for Xen 4.1 in my GRUB menu. Any solution? Please help.
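
    Two things worth sketching out as a check: confirm the hypervisor package actually dropped a Xen image into /boot, and then rebuild the GRUB menu, since the Xen entry only appears after grub.cfg is regenerated.

        # Was a Xen hypervisor image actually installed?
        ls /boot | grep -i xen
        dpkg -l | grep xen-hypervisor
        # Rebuild the GRUB menu and confirm a Xen entry was generated.
        sudo update-grub
        grep -i xen /boot/grub/grub.cfg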

    Read the article

  • Will moving to Facebook/Disqus Commenting lighten the load on my server any?

    - by sublet
    I manage a site that gets about 50 million hits a month. It's a WordPress site, load-balanced over 6 servers, with a Varnish caching setup in front. Right now, 95-97% of page views hit the cache. The only time a page is served fresh from the backend is when a new story is created, or when someone is logged in looking at the stories and commenting. What I am trying to figure out is: if I move to Facebook Comments or Disqus commenting and get rid of the site's users entirely, will that lighten the load? I would think it would, because the only time you hit the backend instead of the cache is when you're logged in - which would then only be the admins. I know it's only 2.5-3%, but I wasn't 100% sure.
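
    The reasoning matches how a typical WordPress-behind-Varnish setup is wired: logged-in users carry a cookie that forces a cache bypass, so removing local logins removes most of those bypasses. A rough VCL sketch of that common rule (names assumed, not taken from this site's actual config):

        # Common vcl_recv rule: requests carrying a logged-in WordPress cookie
        # bypass the cache and hit the backend; everyone else gets cached pages.
        sub vcl_recv {
            if (req.http.Cookie ~ "wordpress_logged_in_" ||
                req.http.Cookie ~ "comment_author_") {
                return (pass);
            }
        }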

    Read the article

  • Thinktecture IdentityServer Azure Edition RC

    - by Your DisplayName here!
    I found some time over the holidays to finalize the Azure edition of IdentityServer. http://identityserver.codeplex.com/releases/view/81206 The biggest difference from the on-premise version (and the earlier Azure betas) is that by default IdSrv now uses Azure Storage for all data storage (configuration & user data). This means there is no longer any need for SQL Azure (which is still supported out of the box - it's just not the default anymore). The download includes a readme file with setup instructions. In a nutshell: create a new hosted service and upload your certificates; modify the service configuration file in the download to your needs (signing cert, connection strings to storage…); deploy the package via the portal or other tools; use the new Powershell scripts to add users. If you encounter any problems, please give me feedback.

    Read the article

  • Links shortener with advanced reporting?

    - by Qualcuno
    I am searching for a script (preferably in PHP) or an external service that lets me create a URL shortener with advanced reports. We have been using Google Short Links for a while: it works really well, but it lacks reporting (it only displays a counter with the total number of redirects). Our setup is as follows: "go.mydomain.com" points to the web service, and we can create links such as "go.mydomain.com/product1". What I'm looking for is a similar service (or self-hosted solution) but with advanced reports, so we can track redirects by day, month, etc., distinguish between mobile and desktop users (very important!), and so on.

    Read the article

  • Sun & Moon Movement

    - by Thomas Mosey
    I'm creating a 2D HTML5 canvas game and am stuck on how to animate my sun and moon. The current setup basically places the moon at -1024 on the X axis and the sun at 0, and animates them at 1 pixel a second. My canvas is 1024 pixels wide, and whenever the sun's or moon's X position crosses the width of the canvas, its X position is set back to -1024 to repeat the animation. What I am trying to do is get this to sync up with my day/night cycle. Each day is 10000 ticks long (a tick being added every frame), with day and night each taking 50% (5000 ticks each). What I am trying to calculate is what I need to add to the X position of each body per frame to get the sun from an X of 0 to 1024 after 5000 ticks/frames. Any help is appreciated.
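
    Since the sun has to cover 1024 pixels in 5000 ticks, the per-tick step is just 1024 / 5000 = 0.2048 pixels. A small sketch (in Python for brevity; the names are made up) that derives each body's X directly from the tick counter instead of accumulating a step:

        # Sketch: per-tick step so a body crosses the canvas (1024 px) in one
        # half-cycle (5000 ticks): 1024 / 5000 = 0.2048 px per tick.
        CANVAS_WIDTH = 1024
        HALF_CYCLE_TICKS = 5000
        STEP = CANVAS_WIDTH / HALF_CYCLE_TICKS       # 0.2048

        def body_x(tick, start_x):
            # Sun starts at 0, moon at -1024; wrap back to -1024 past the right edge.
            x = start_x + tick * STEP
            return ((x + CANVAS_WIDTH) % (2 * CANVAS_WIDTH)) - CANVAS_WIDTH

        print(body_x(0, 0), body_x(2500, 0), body_x(5000, -1024))   # 0.0 512.0 0.0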

    Read the article

  • How to distribute applications?

    - by Dr Deo
    I am new to Ubuntu development. As a learning experience I have written a custom chat application using Qt4, and I want to deploy it in some sort of setup file. What's the easiest way of deploying an application with respect to: setting up desktop icons; automatically requesting administrator privileges to execute; inserting an entry into the startup menu; automatically compressing my application to reduce the download size; and automatic startup of my application without user intervention? I am familiar with using NSIS scripts on Windows, but I don't know where to begin on Ubuntu. I would prefer a solution similar to NSIS scripts.
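
    On Ubuntu the usual equivalent of an NSIS installer is a .deb package, and the menu entry / desktop icon comes from a freedesktop .desktop file that the package installs. A minimal sketch of such a file (names and paths are illustrative, not a finished package):

        # /usr/share/applications/mychat.desktop  (illustrative name and paths)
        [Desktop Entry]
        Type=Application
        Name=MyChat
        Comment=Custom Qt4 chat client
        Exec=/usr/bin/mychat
        Icon=/usr/share/pixmaps/mychat.png
        Categories=Network;InstantMessaging;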

    Read the article

  • Why is my Quickly app full of fail?

    - by bstpierre
    I tried to use Quickly on Ubuntu 12.04 to create an application, but it does not behave as described on that linked page. I don't get a popup when creating the application (see the error below).

        % quickly create ubuntu-application foo
        Creating project directory foo
        Creating bzr repository and committing
        Launching your newly created project!
        (foo:16847): GLib-GIO-ERROR **: Settings schema 'org.gnome.desktop.interface' is not installed
        Congrats, your new project is setup!
        cd /tmp/foo/ to start hacking.

    It creates a project, but when I try to run it, it crashes and burns:

        % cd foo
        % quickly run
        (foo:22639): GLib-GIO-ERROR **: Settings schema 'org.gnome.desktop.interface' is not installed

    Is this because I'm not using gnome-shell? What can I do to get a working project? (Edit: as a side note, I'd be willing to debug this myself, but I don't even get a traceback. What do I have to do to get Quickly to give me a traceback?)
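
    The error points at a missing GSettings schema rather than at Quickly itself. A quick check, sketched under the assumption that this schema normally comes from the gsettings-desktop-schemas package on 12.04:

        # Is the schema registered at all?
        gsettings list-schemas | grep org.gnome.desktop.interface
        # If not, install the package that normally provides the GNOME desktop schemas.
        sudo apt-get install gsettings-desktop-schemas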

    Read the article
