Search Results

Search found 20569 results on 823 pages for 'pc settings'.


  • Battery is drained too quickly

    - by LucaB
    I'm getting really low battery life under Ubuntu, not even close to Windows. I tried powertop and saw that my laptop consumes nearly 20 watts (a bit more) at idle. I tried installing laptop-mode-tools and toggling the "good"/"bad" tunables in powertop, but nothing changes. I also see that the HD audio output device is running at 100% all the time. Could this be the problem? This is the report from powertop:

        The battery reports a discharge rate of 22.8 W
        The estimated remaining time is 33 minutes

        Summary: 381.8 wakeups/second, 0.0 GPU ops/second and 0.0 VFS ops/sec

        Usage       Events/s   Category    Description
        3.2 ms/s    182.7      Timer       tick_sched_timer
        100.0%                 Device      Audio codec hwC0D3: Intel
        7.9 ms/s    25.1       Process     /usr/bin/X :0 -auth /var/run/lightdm/root/:0 -nolisten tcp vt7 -novtswitch -background no
        1.9 ms/s    24.2       Interrupt   [6] tasklet(softirq)
        2.9 ms/s    23.2       Process     /usr/lib/chromium-browser/chromium-browser --type=zygote
        8.1 ms/s    20.3       Process     /usr/lib/unity/unity-panel-service
        0.7 ms/s    17.4       Timer       hrtimer_wakeup
        4.2 ms/s    12.6       Process     unity-2d-panel
        604.4 µs/s  9.7        Process     syndaemon -i 2.0 -K -R -t
        149.7 µs/s  9.7        kWork       ieee80211_iface_work
        0.8 ms/s    8.7        Process     metacity
        19.5 ms/s   1.0        Process     powertop
        3.0 ms/s    6.8        Process     //bin/dbus-daemon --fork --print-pid 5 --print-address 7 --session
        699.0 µs/s  6.8        Process     /usr/lib/thunderbird/thunderbird
        4.3 ms/s    4.8        Process     gnome-terminal
        658.9 µs/s  2.9        Interrupt   [1] timer(softirq)
        75.1 µs/s   2.9        kWork       iwl_bg_run_time_calib_work
        163.8 µs/s  1.9        Process     /usr/lib/accountsservice/accounts-daemon
        70.6 µs/s   1.9        Process     [ksoftirqd/2]
        25.8 µs/s   1.9        Process     [ksoftirqd/0]
        1.0 ms/s    1.0        Process     /usr/bin/python /usr/sbin/powernapd
        408.2 µs/s  1.0        Process     unity-2d-shell
        189.8 µs/s  1.0        Process     /usr/lib/chromium-browser/chromium-browser
        124.4 µs/s  1.0        Process     /usr/lib/unity-lens-applications/unity-applications-daemon
        113.3 µs/s  1.0        Process     /usr/lib/gnome-settings-daemon/gnome-settings-daemon
        112.0 µs/s  1.0        Process     nautilus -n
        104.9 µs/s  1.0        Process     /usr/lib/gvfs/gvfsd-trash --spawner :1.2 /org/gtk/gvfs/exec_spaw/0
        77.5 µs/s   1.0        Process     /usr/lib/x86_64-linux-gnu/colord/colord
        75.6 µs/s   1.0        Process     /usr/lib/gvfs/gvfs-gdu-volume-monitor
        75.0 µs/s   1.0        Interrupt   [53] i915
        74.9 µs/s   1.0        Process     /usr/lib/gvfs/gvfs-afc-volume-monitor

    What should I do to make the battery consumption lower?
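
    For reference, a minimal sketch (assuming the snd_hda_intel driver behind powertop's "Audio codec hwC0D3: Intel" line) of enabling audio power saving, which is a common way to stop the codec from drawing power while idle:

        # let the HDA codec sleep after 1 second of silence
        echo 1 | sudo tee /sys/module/snd_hda_intel/parameters/power_save
        echo Y | sudo tee /sys/module/snd_hda_intel/parameters/power_save_controller

        # make it persistent across reboots (file name is illustrative)
        echo "options snd_hda_intel power_save=1 power_save_controller=Y" | sudo tee /etc/modprobe.d/51-audio-powersave.conf

    Whether this changes the 22.8 W discharge rate depends on the rest of the hardware; it only addresses the audio device shown at 100%.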

    Read the article

  • wine 1.4 regedit makes screen flicker on 12.04 with dual monitor setup

    - by s1lv3r
    I have a dual-monitor setup running two 23" displays at 1920x1080 which has the following problem: when running any Wine application (for example "wine regedit" from the console) the screen flickers and the windows show artifacts. Also, sometimes taking a screenshot with the Print key will make Compiz crash (the launcher and all window bars/menus are gone) while a Wine application is running. I don't have the same problems on my notebook, which has the same setup; the only difference is the notebook has ATI graphics and this PC has Nvidia. This is the output of lshw -c video (translated from German):

        *-display
             description: VGA compatible controller
             product: G72 [GeForce 7300 LE]
             vendor: NVIDIA Corporation
             physical id: 0
             bus info: pci@0000:07:00.0
             version: a1
             width: 64 bits
             clock: 33MHz
             capabilities: pm msi pciexpress vga_controller bus_master cap_list rom
             configuration: driver=nvidia latency=0
             resources: irq:16 memory:fa000000-faffffff memory:d0000000-dfffffff memory:fb000000-fbffffff memory:fce00000-fce1ffff

    I also noticed that running xrandr from the console makes the screen flicker for a few seconds on this PC, which also doesn't happen on my notebook. Removing one screen from the setup stops the flickering and the artifacts inside the Wine applications. Does anybody have advice on what I could try to change to make this work?
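
    One hedged workaround to try (a sketch, not a confirmed fix for this driver combination): run the application inside a Wine virtual desktop so it never changes the X display mode directly:

        # open regedit in an emulated 1920x1080 desktop window
        wine explorer /desktop=shell,1920x1080 regedit

    The same setting can be made permanent in winecfg on the Graphics tab ("Emulate a virtual desktop").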

    Read the article

  • Why is Ubuntu offline (except torrents) while Windows is online?

    - by Fahim al Islam
    I am using a static wired connection. Everything was perfect, but since a few hours ago I can't access any website. Dropbox and Ubuntu One also can't connect, and ping requests fail, yet I can still download through torrents. I am not running torrent downloads and browsing at the same time, so I don't think torrents are using all the bandwidth. One important point is that this connection works perfectly under Windows on the same PC (my PC is dual-boot). I have tried what izx suggested (using "sudo sh -c 'echo nameserver 8.8.8.8 >> /etc/resolv.conf'"), but I'm facing the same problem again. Now I can't even ping 8.8.8.8 or google.com, though I can ping 74.125.228.2 (which is a Google IP address). I can't understand what's happening or why. I'm new to this website and many of its rules are unknown to me, so please bear with my mistakes. Looking forward to help from anyone. Thanks to all.
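
    A quick diagnostic sketch (the gateway address is illustrative) to separate a routing problem from a DNS problem:

        ip route                     # is there a default route via the gateway?
        ping -c 3 192.168.1.1        # can the router itself be reached?
        cat /etc/resolv.conf         # which nameservers are actually configured?
        nslookup google.com          # does name resolution work at all?

    If the raw IP (74.125.228.2) answers but 8.8.8.8 and hostnames do not, the router or ISP may be filtering some destinations rather than DNS alone being broken.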

    Read the article

  • Chainloading GRUB2 from BURG

    - by WindowsEscapist
    I have an old PC with Puppy Linux in addition to Ubuntu and Windows XP. This creates a LOT of menu entries (all of which I would like to keep):

        Ubuntu 10.04
        Ubuntu Recovery Mode
        Memtest x86
        Memtest Serial
        Windows XP Pro
        Precise Puppy Linux
        Precise Puppy TORAM
        Puppy 4.3.1
        Puppy 4.3.1 TORAM
        Plop Boot Manager (for booting to USB; the PC doesn't have a BIOS option for it)

    Now, on my fancy shiny laptop I've gotten really attached to BURG, and I would like a setup where I have a Windows icon, an Ubuntu icon, and an arrow that chainloads GRUB2 so that I can boot from USB or run Puppy if need be (all those entries obviously won't fit into the BURG theme I use, Lightness). The problem is that GRUB2 can't install to the beginning of a partition like it used to (and I am reluctant to specify anything with --force), at least not without warning "This is a BAD idea!". So I'm kind of at a loss here. I can't see how the folding option would work, because all of those other options would have the same icon once unfolded (Lightness is non-text-based). If I do embed GRUB using grub-install /dev/sdaX --force, how do I chainload it with BURG? Is there another way?
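
    A sketch of the chainloading half (the partition number and BURG's config paths are assumptions; BURG mirrors GRUB 2's layout, so custom entries normally live in /etc/burg.d/40_custom):

        # append a chainload entry to BURG's custom entries file, then rebuild burg.cfg
        printf '%s\n' \
            'menuentry "GRUB 2 (all other entries)" {' \
            '  insmod chain' \
            '  set root=(hd0,2)' \
            '  chainloader +1' \
            '}' | sudo tee -a /etc/burg.d/40_custom
        sudo update-burg

    This only works once GRUB's boot code has actually been embedded in the partition's boot sector (i.e. after the --force install the post is reluctant about); BURG then simply hands control to whatever lives there.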

    Read the article

  • BizTalk 2009 - Error when Testing Map with Flat File Source Schema

    - by StuartBrierley
    I have recently been creating some flat file schemas using the BizTalk Server 2009 Flat File Schema Wizard, and mapping these flat file schemas to a "normal" XML schema format. I had not previously had any cause to map flat files, and I ran into some trouble when testing the first of these flat file maps; with an instance of the flat file as the source, it threw an XSL transform error:

        Test Map.btm: error btm1050: XSL transform error: Unable to write output instance to the following <file:///C:\Documents and Settings\sbrierley\Local Settings\Temp\_MapData\Test Mapping\Test Map_output.xml>. Data at the root level is invalid. Line 1, position 1.

    Due to the complexity of the map in question, I decided to create a small test map using the same source and destination schemas to see if I could pinpoint the problem. Although the source message instance validated correctly against the flat file schema, when I tested this simplified map I got the same error. After a time of fruitless head scratching and some serious Google time, I figured out what the problem was: looking at the map properties, I noticed that I had the test map input set to "XML" - for a flat file instance this should be set to "Native".

    Read the article

  • Empty APN list on your Android phone? Restart the phone to fix it

    - by Gopinath
    Today I tried to connect to the internet on my Google Galaxy Nexus running 4.2.2 using the cellular data connection, and it failed. I tried reaching a customer care representative to figure out why the data connection was not working, but the robots (Interactive Voice Response systems) never allowed me to reach a human. After digging through the settings I found that an empty list of APNs (Access Point Names) was the reason I could not connect to the internet. I'm not sure what caused the APN list to vanish, but I tried to create a new one matching the settings required for AT&T mobile. To my surprise, the newly created APN was also not shown in the APN list. So there was something wrong with the phone - my existing APNs were not shown, and the newly created one was not displayed either. A simple Google search on this problem turned up many forum discussions, and the solution is to restart the phone. As soon as I restarted my phone the APN list was automatically populated and I was able to connect to the internet on my mobile.

    Read the article

  • If I install Ubuntu 12.04, will it recognize all of my RAM?

    - by user91048
    I have a question that's been bugging me for a long time. A friend of mine told me that when he had Ubuntu 11.10 installed with 8GB of RAM, the OS only recognized 3.4GB. In the next few days I'll be buying a new computer built from components, and it will have 8GB of RAM. Does the video card need to have its own dedicated video memory for the OS to recognize all of the RAM? If you could give me some advice on how to configure my PC before I buy it so this problem doesn't happen to me, that would be great. Thanks.
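
    For what it's worth, a quick sketch of how to check what an installed system actually sees (a ~3.4GB ceiling usually comes from running a 32-bit, non-PAE kernel rather than from the video card):

        uname -m    # x86_64 means a 64-bit kernel that can address all 8 GB
        free -m     # total RAM (in MB) that the running kernel recognizes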

    Read the article

  • XNA Windows Resolution / Mouse Position Bug

    - by Ian Hern
    In XNA, when in windowed mode and the resolution (set via PreferredBackBufferWidth/Height) is close to the resolution of the display, the view is distorted (zoomed in a bit) and the mouse coordinates are wrong. Here is what it looks like when I draw a bunch of lines to the screen (screenshots: normal, the error on my ASUS Notebook G73Jh, the error on my EEE PC 1001P). In the top left of the screen the mouse position is correct, but the further you get away from it the more out of sync it becomes. Here are some screenshots of the mouse in different positions with the game drawing a circle underneath where it thinks the mouse is (top left, bottom right). If you shrink the resolution by a couple of pixels then it goes back to working normally, so my first thought at a fix was to limit the max resolution to a little smaller than the display resolution. I figured out the maximum resolution that works in a couple of different modes, but there doesn't seem to be a pattern that would let me determine it from the display resolution:

        Computer            | Screen Resolution | Max Error-Free | Difference
        ASUS Notebook G73Jh | 1920x1080         | 1924x1059      | +4x-21
        ASUS Notebook G73Jh | 1024x600          | 1018x568       | -6x-32
        EEE PC 1001P        | 1024x600          | 1020x574       | -4x-26

    Because the differences don't form a pattern I can't hack in a solution; the one that is even +4 baffles me. Here is a project that demonstrates the problem - just set the resolution to the resolution of your display. Any ideas on how I might fix this issue? As an interesting aside, I tried to use FRAPS to capture a video of the issue, but FRAPS actually records without distortion or mouse offset.

    Read the article

  • Change order of monitors without changing fullscreen "size"

    - by user171489
    I have a dual-monitor setup. My primary monitor is a 22" with a max resolution of 1680x1050 and my secondary is a 19" with a max resolution of 1280x1024. The secondary stands on the left side of the primary one. My problem is that if I change the order of the monitors in the NVIDIA X Server Settings so that the secondary is the first one (the one on the left), full-screen Flash is scaled for my secondary monitor even when it's displayed on my primary one - meaning I get a 1280x1024 "fullscreen" window on my bigger primary monitor. When I configure the X server settings so the secondary monitor is the one on the right, I don't have this problem; the only thing then is that I have to scroll out on the right to get to my monitor on the right. I can't physically move the secondary monitor to the right side of the primary due to lack of space, and I believe there must be a software solution. ;) Thanks in advance.
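
    As a software-only sketch (output names are illustrative; check xrandr for the real ones, and note this only applies if the NVIDIA driver exposes RandR outputs rather than an old TwinView-style single screen): keep the 22" primary and place the 19" to its left at the X level:

        xrandr --output DVI-I-1 --mode 1680x1050 --primary \
               --output VGA-1   --mode 1280x1024 --left-of DVI-I-1

    With older TwinView setups the equivalent layout has to be set in nvidia-settings instead; which monitor Flash treats as "first" for fullscreen follows from that layout.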

    Read the article

  • Collabnet Subversion and Self Signed Certificates

    - by Robert May
    We installed CollabNet as our Subversion server recently. This is the first time that we've used it. In general it seems pretty good, but we ran into a problem with it. People were getting the following error in Tortoise:

        OPTIONS of 'https://xxxx.xxxxxxxx.xxxx/svn/xxxxx': SSL handshake failed: SSL error code - 1/1/336032856 (https://xxxx.xxxxxxxx.xxxx)

    The odd thing is that for some people it worked, and for others it didn't! I also couldn't find anything useful out on the internet. We had checked the "Subversion Server should serve via https" option in the settings, and all of the ports were open, etc. This option causes a self-signed certificate to be used. What we discovered: Tortoise must use the same URL as is in the Hostname field on the General settings for CollabNet, or you'll get this error. Basically, some people were using https://svn.xxxxxxx.xxxxx and others were using https://computername.xxxxxxxx.xxxx. Because the Hostname field used the computer-name version, the whole thing broke. By changing the host name to the svn version, which is what they should be using, the problem went away. The users do get the "Accept Certificate" prompt, but we can live with that!

    Technorati Tags: Subversion,Collabnet

    Read the article

  • Delphi Client-Server Application using Firebird 2.5 error

    - by Japie Bosman
    I have got a lengthy question to ask. First of all, I'm still very new when it comes to Delphi programming, and my experience has mostly been developing small single-user database applications using ADO and an Access database. I now need to make the transition to a client-server application, and this is where the problem starts. I decided to use Firebird 2.5 embedded as my database, as it is open source, can be used with the InterBase components in Delphi, and lets multiple clients access the database simultaneously. So I followed the InterBase tutorial in Delphi. I managed to connect the client to the server and see the data in the example (while both were running on my PC), but when I moved the client to another PC, keeping the server on mine, and ran it to see if I could connect to the server, it gave me the following error:

        Exception EIdSocketError in module clientDemo.exe at 0029DCAC.
        Socket Error # 10061
        Connection refused.

    I understand that this might be because the host is defined as localhost in the client. But here is my first question: in the TSQLConnection you can set the hostname under Driver-Hostname. What I want to know is how to do this at run time, as I cannot get at the property when I try to make an edit box that lets the user enter the value and then set it via code, for example:

        SQLConnection1.Driver.Hostname := edtHost.Text;

    There is no such property to set, so how do you set the hostname at run time? I'm using Delphi XE2. There are still a lot of questions to come, especially around deployment, but I will take this piece by piece, and I appreciate the advice.

    Read the article

  • Correct nvidia+intel graphics setup in 14.04

    - by Espressofa
    Just upgraded to 14.04 to try to fix some other issues. Now something has gone wrong with my graphics. I have a ThinkPad T530 with Intel and Nvidia graphics cards.

        $ inxi -SGx
        System:    Host: xyz Kernel: 3.13.0-24-generic x86_64 (64 bit, gcc: 4.8.2) Desktop: N/A Distro: Ubuntu 14.04 trusty
        Graphics:  Card-1: Intel 3rd Gen Core processor Graphics Controller bus-ID: 00:02.0
                   Card-2: NVIDIA GF108M [NVS 5400M] bus-ID: 01:00.0
                   X.Org: 1.15.1 drivers: fbdev,vesa,intel,nouveau (unloaded: nvidia) Resolution: [email protected]
                   GLX Renderer: N/A GLX Version: N/A Direct Rendering: N/A

        $ glxinfo
        name of display: :0
        Xlib:  extension "GLX" missing on display ":0".
        (the line above repeats many more times)
        Error: couldn't find RGB GLX visual or fbconfig
        Error: couldn't find RGB GLX visual or fbconfig

    I'm not sure what I did, but now something is wrong with my graphics, as should be visible from the commands above. nvidia-detector says "none" as well. I used to have Bumblebee, but then some website said to remove it, and now something's clearly wrong. What's the right way to set things up? Should I try to add Bumblebee back? Here's what's installed now:

        $ dpkg --get-selections | grep nvidia
        nvidia-319                install
        nvidia-331                install
        nvidia-libopencl1-331     install
        nvidia-opencl-icd-331     install
        nvidia-prime              install
        nvidia-settings           install
        nvidia-settings-319       install
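
    A possible cleanup sketch (package names as shipped in 14.04; this assumes the Optimus setup should go through nvidia-prime rather than Bumblebee):

        # drop the leftover 319 driver so only one nvidia version is installed
        sudo apt-get purge 'nvidia-319*' 'nvidia-settings-319*'
        sudo apt-get install --reinstall nvidia-331 nvidia-prime
        sudo prime-select nvidia     # or: sudo prime-select intel
        # then log out and back in (or reboot) so the right GL libraries are linked

    The mixed 319/331 packages are a plausible reason the GLX libraries ended up broken, but that is an inference from the dpkg listing, not a certainty.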

    Read the article

  • Simple Architecture Verification

    - by Jean Carlos Suárez Marranzini
    I just made an architecture for an application whose function is scoring, saving and loading tennis games. The architecture has two kinds of elements: components and layers.

    Components: standalone elements that can be consumed by other components or by layers. They might also consume functionality from the model/bottom layer.

    Layers: software components whose functionality rests on previous layers (except for the model layer).

    Layers:
    - Models: data and its behavior.
    - Controllers: a layer that allows interaction between the views and the models.
    - Views: the presentation layer for interacting with the user.

    Components:
    - Persistence: makes sure the game data can be stored away for later retrieval.
    - Time Machine: records changes in the game through time so it's possible to navigate the game back and forth.
    - Settings: contains the settings that determine how some of the game logic will apply.
    - Game Engine: contains all the game logic, which it applies to the game data to determine the path the game should take.

    This is an image of the architecture (I don't have enough rep to post images): http://i49.tinypic.com/35lt5a9.png

    The requirements this architecture should satisfy are the following:
    - Save and load games.
    - Move through game history and see how the scoreboard changes as the game evolves.
    - Tie-breaks must be properly managed.
    - Games must be classified by hit type.
    - Every point can be modified.
    - Match name and player names must be stored.
    - Game logic must be configurable by the user.

    I would really appreciate any kind of advice or comments on this architecture, to see if it is well built and makes sense as a whole. I took the idea from this link: http://en.wikipedia.org/wiki/Model%E2%80%93view%E2%80%93controller

    Read the article

  • Added resolution not working after upgrading to 12.04

    - by David
    After upgrading, my screen resolution (added by xrandr commands at startup) no longer works like before. Messages appear showing an error that I never had in 11.10: "No se pudo aplicar la configuración almacenada para los monitores" ("Can't apply the stored configuration for the monitors"). This script didn't work either:

        xrandr --newmode "1280x1024_60.00" 109.00 1280 1368 1496 1712 1024 1027 1034 1063 -hsync +vsync
        xrandr --addmode VGA1 1280x1024_60.00
        xrandr --output VGA1 --mode 1280x1024_60.00

    I also tried deleting monitors.xml, but nothing - that only made the warning window go away. It's said to be a normal, well-known problem for PCs with Intel integrated video cards. The new version of gnome-settings-daemon stores its configuration information in dconf rather than gconf. I tried something, but the problem persists. This is what I did:

    - Install the dconf-tools package, and then run dconf-editor.
    - In the tree on the left, navigate to org > gnome > settings-daemon > plugins > xrandr.
    - Uncheck the "active" checkbox.
    - Restart your X server (Ctrl+Alt+Backspace).

    (It didn't work out for me, but it may be helpful to someone.)
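
    The same toggle can be flipped from a terminal; a sketch, assuming the schema path used by gnome-settings-daemon in 12.04:

        gsettings set org.gnome.settings-daemon.plugins.xrandr active false

    With that plugin off, gnome-settings-daemon stops re-applying monitors.xml at login, so the mode set by the xrandr script should survive.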

    Read the article

  • How do I stop sound coming from my speakers with a faulty headphone socket?

    - by Andy
    On my laptop, I have a faulty headphone socket, so when I insert headphones into it, the speakers do not mute. I can confirm that this problem is caused by faulty hardware and not software, as when I twist the headphone jack the speakers come on and off according to the movements.

    On previous versions of Ubuntu, I worked around this problem by going into alsamixer and disabling "Auto-Mute Mode", and then going into the sound settings and choosing "Analog Headphones". However, on 12.04 no such option exists, rendering my headphones unusable with no way to work around the problem.

    I momentarily thought I had this problem fixed when I installed PulseAudio Volume Control from the Software Centre. I selected the Output Devices tab, and under "Built-in Audio Analogue Stereo" I selected "Headphones" for the port. However, this almost randomly seems to change back to "Speakers", despite me setting "Auto-Mute Mode" as disabled.

    Basically, I would like a way to permanently mute the speakers so only the headphones will play sound, without it losing my settings. It is ridiculous that such a simple setting has been taken away to "simplify" the user interface.
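
    A command-line sketch of the old workaround (the card index, sink name and port name are illustrative; list the real ones with "pactl list sinks"):

        amixer -c 0 sset 'Auto-Mute Mode' Disabled
        pactl set-sink-port alsa_output.pci-0000_00_1b.0.analog-stereo analog-output-headphones

    Running "sudo alsactl store" afterwards saves the ALSA side of that state, which may be enough to keep sound on the headphone path even when PulseAudio flips the port back.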

    Read the article

  • Obscure SPUtility.SendMail Behavior When Manually Passing in Mail Headers

    - by Damon
    There are two ways to send mail in SharePoint: you can either use the mail components from the System.Net namespace, or you can send email using SharePoint's SPUtility.SendMail method. One of the benefits of the SPUtility.SendMail method is that it uses the mail configuration from SharePoint, so you can manage settings in Central Administration instead of having to go through and modify your web.config file. SPUtility.SendMail can get the job done, but it's definitely not as developer-friendly as the components from the System.Net namespace. If you want to CC someone on an email, for example, you do NOT have a nice CC parameter - you have to manually add the CC mail header and pass it into the SPUtility.SendMail method. I had to do this the other day, and ran into a really obscure issue.

    If you do NOT pass the headers into the method, then SharePoint sends the email using the From address configured in the Outgoing Mail settings in Central Admin. If you pass headers into the method but do not include the from header, then SharePoint sends the mail using the email address of the current user. This can be an issue if your mail server is set up to reject email from an invalid email address or an email address that is not on your domain. The way to fix this issue is to always pass in the from header. If you want to use the configured From address, you can do the following:

        SPWebApplication webApp = SPWebApplication.Lookup(new Uri(SPContext.Current.Site.Url));
        StringDictionary headers = new StringDictionary();
        headers.Add("from", webApp.OutboundMailSenderAddress);

    Read the article

  • How do I disable touchpad tap to click?

    - by AWE
    You've heard this a million times, but the "tap to click" is a pain in the behind and I want to disable it. There is no touchpad listed in gpointing-device-settings, and none in Mouse and Touchpad in System Settings either. I've tried some commands in the terminal but it's all crap. Dconf-editor doesn't react. How about solving this once and for all?

    xinput list:

        ⎡ Virtual core pointer                    id=2    [master pointer  (3)]
        ⎜   ↳ Virtual core XTEST pointer          id=4    [slave  pointer  (2)]
        ⎜   ↳ PS/2 Generic Mouse                  id=13   [slave  pointer  (2)]
        ⎣ Virtual core keyboard                   id=3    [master keyboard (2)]
            ↳ Virtual core XTEST keyboard         id=5    [slave  keyboard (3)]
            ↳ Power Button                        id=6    [slave  keyboard (3)]
            ↳ Video Bus                           id=7    [slave  keyboard (3)]
            ↳ Video Bus                           id=8    [slave  keyboard (3)]
            ↳ Power Button                        id=9    [slave  keyboard (3)]
            ↳ Sleep Button                        id=10   [slave  keyboard (3)]
            ↳ Laptop_Integrated_Webcam_HD         id=11   [slave  keyboard (3)]
            ↳ AT Translated Set 2 keyboard        id=12   [slave  keyboard (3)]
            ↳ Dell WMI hotkeys                    id=14   [slave  keyboard (3)]
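
    If the touchpad really is being driven as "PS/2 Generic Mouse" (as the xinput output suggests), the synaptics tools won't see it; but where a synaptics-driven touchpad is present, the usual one-liner is worth noting as a sketch:

        synclient TapButton1=0 TapButton2=0 TapButton3=0

    To make it stick it can be added as a startup application. With the generic PS/2 fallback, tapping is turned into clicks by the touchpad firmware itself, so the better fix there is usually getting the synaptics/psmouse driver to claim the device in the first place.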

    Read the article

  • How do I mount my Android phone?

    - by Amanda
    I'm puzzled because my phone used to just appear when I plugged it in. It doesn't anymore, and the development options are definitely set to allow USB debugging. The phone is charging via USB but doesn't appear in lsusb:

        [0 amanda@luna android-sdk-linux_86]$ lsusb
        Bus 001 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 002 Device 001: ID 1d6b:0002 Linux Foundation 2.0 root hub
        Bus 003 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 004 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 005 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 006 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 007 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 008 Device 001: ID 1d6b:0001 Linux Foundation 1.1 root hub
        Bus 001 Device 004: ID 17ef:4807 Lenovo UVC Camera
        Bus 003 Device 012: ID 413c:1003 Dell Computer Corp. Keyboard Hub
        Bus 003 Device 003: ID 08ff:2810 AuthenTec, Inc. AES2810
        Bus 003 Device 013: ID 413c:2010 Dell Computer Corp. Keyboard
        Bus 003 Device 014: ID 046d:c001 Logitech, Inc. N48/M-BB48 [FirstMouse Plus]

    adb devices -l shows nothing. In my Wireless and Network settings I changed the USB connection settings to "Mass storage" - they were set to "Ask on connection", though I was definitely not getting asked. I don't get any "Click here to connect via USB" alert either. I'm not even sure whether the issue is my phone or my computer; it seems odd that it isn't even appearing in lsusb. Not for nothing, the thumb drive on my keyring also does not appear in lsusb - I've tried both in a bunch of different ports. I kind of assume the thumb drive is just borked, but it could be my OS.
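
    One way to narrow down phone vs. computer (a sketch; log file names vary by release) is to watch what the kernel and udev report at the moment the cable goes in:

        udevadm monitor --udev
        # in another terminal:
        tail -f /var/log/syslog        # or /var/log/kern.log

    If nothing at all is printed when the phone (or the thumb drive) is plugged in, the port, cable, or USB controller is suspect rather than Android's USB mode settings.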

    Read the article

  • How should I structure my database to gain maximum efficiency in this scenario?

    - by Bob Jansen
    I'm developing a PHP script that analyzes the web traffic of my clients' websites. By placing a link to a JavaScript file on the client's website (think of Google Analytics), my script harvests information like the visitor's IP address, referrer link, current page link, user agent, etc. My clients can then view these statistics via a control panel that I have built. These clients can also adjust profile settings, set firewall rules, create support tickets and pay invoices.

    Currently all the traffic is stored in one table. You can imagine that this table would become very large, as some of my clients receive thousands of pageviews per day. Furthermore, the traffic data of every client is stored in the same table, creating a mess. The same is currently true of the firewall rules, and of the invoice and support system. I'm looking for a way to structure my database in a more organized way to hold large amounts of data for multiple users. This is the first project I'm developing that deals with so much data, and I would like to hear suggestions and tips.

    I was thinking of using multiple databases to structure the data. The main database would store user data (email, password, id, etc.) and admin/website settings. Then each client would have a unique database labeled prefix_userid, which carries tables holding their traffic, invoice, and support ticket data. Would this be a solution, and would it slow down or speed up overall performance (that is, spreading the data over multiple databases)? I have a solid VPS, but would like to be as safe and efficient as possible.

    Read the article

  • Frequent disconnects using wlan AR9285

    - by John Neil
    I'm getting a large number of disconnects from my AR9285 wireless LAN device since I switched to Oneiric Server (I did not see these happen with Oneiric Desktop). Here is the syslog snippet:

        Oct 17 09:43:17 weather kernel: [ 1537.329138] wlan0: deauthenticated from 00:12:17:7a:8e:42 (Reason: 7)
        Oct 17 09:43:17 weather kernel: [ 1537.340409] cfg80211: All devices are disconnected, going to restore regulatory settings
        Oct 17 09:43:17 weather kernel: [ 1537.340423] cfg80211: Restoring regulatory settings
        Oct 17 09:43:17 weather kernel: [ 1537.340435] cfg80211: Calling CRDA to update world regulatory domain
        Oct 17 09:43:17 weather kernel: [ 1537.348571] cfg80211: Ignoring regulatory request Set by core since the driver uses its own custom regulatory domain
        Oct 17 09:43:17 weather kernel: [ 1537.348581] cfg80211: World regulatory domain updated:
        Oct 17 09:43:17 weather kernel: [ 1537.348586] cfg80211:     (start_freq - end_freq @ bandwidth), (max_antenna_gain, max_eirp)
        Oct 17 09:43:17 weather kernel: [ 1537.348594] cfg80211:     (2402000 KHz - 2472000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
        Oct 17 09:43:17 weather kernel: [ 1537.348600] cfg80211:     (2457000 KHz - 2482000 KHz @ 20000 KHz), (300 mBi, 2000 mBm)
        Oct 17 09:43:17 weather kernel: [ 1537.348607] cfg80211:     (2474000 KHz - 2494000 KHz @ 20000 KHz), (300 mBi, 2000 mBm)
        Oct 17 09:43:17 weather kernel: [ 1537.348613] cfg80211:     (5170000 KHz - 5250000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)
        Oct 17 09:43:17 weather kernel: [ 1537.348620] cfg80211:     (5735000 KHz - 5835000 KHz @ 40000 KHz), (300 mBi, 2000 mBm)

    Here is the relevant lspci output:

        # lspci | grep Atheros
        02:00.0 Network controller: Atheros Communications Inc. AR9285 Wireless Network Adapter (PCI-Express) (rev 01)

    I have done quite a bit of searching and saw discussions for previous versions of Ubuntu that recommended installing the linux-backports-modules package. However, this does not appear to be available for Oneiric (only the headers are listed as a package). Any advice on how to achieve a stable wireless connection for this server? Its location makes a wired connection impractical.
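
    Two hedged things that are commonly tried for ath9k/AR9285 drops (a sketch; the interface name and the effect on this particular setup are not guaranteed):

        # turn off 802.11 power saving for the current session
        sudo iwconfig wlan0 power off

        # reload ath9k with hardware crypto disabled, and keep the option across reboots
        echo "options ath9k nohwcrypt=1" | sudo tee /etc/modprobe.d/ath9k.conf
        sudo modprobe -r ath9k && sudo modprobe ath9k

    For what it's worth, reason code 7 in the deauthentication line means "class 3 frame received from non-associated station", i.e. the access point no longer considered the card associated at that moment.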

    Read the article

  • Game: Age of Empires sound is good but video is "out of range"

    - by Ezekiel
    I'm new to the Ubuntu realm. Currently I'm using Linux Mint 12 with Wine 1.4 and PlayOnLinux as the game loading/playing programs. The video card is an MSI GeForce FX5200 (NVIDIA) and it is 3D enabled. I can play the Call of Duty 5 demo just fine. My real love is the Age of Empires series. I loaded the Wine version of the AOE 1 demo: no sound and no picture, just a black screen with an "Out of Range" window in red. I then loaded my CD version of AOE 1 through PlayOnLinux: I get the sound just fine, but again the black screen with the "Out of Range" window in red. I have tried all the monitor settings, both in the game's settings and in winecfg; none of the eight monitor options worked in any combination. I have checked all the questions and blogs on this error and tried everything I found, but no one seems to come up with a real fix. I guess I need to know exactly what "Out of Range" means. Any help? Anywhere? Thanks

    Read the article

  • Adventures in Lab Management Configuration: Part 3 of 3

    - by Enrique Lima
    This is long overdue, but here it is. In the previous two sections I discussed how I got a CMMI v4.2 process template to take on the same fields as v5 and therefore be able to communicate with MTM and Lab Manager. And that was quite a success. Yet when I opened up Lab Management, while it was fully aware of the VMs being there, it refused to let me enroll them into an environment. It kept stating there was no suitable host to deploy the VM to, error TF259115. This was an indication that something was not matching the expected network configuration between TFS and Hyper-V/SCVMM. So, here are a couple of things that took place:

    - Verified that the network segment specified for network isolation matched what was configured physically, for either DHCP or manually assigned IP addressing for the guest VMs.
    - Made sure TFS was fully aware of the configuration settings for the network location name. For that I issued:

        tfsconfig lab /settings /networklocation:"<name of the network location configured in SCVMM>"

    That last item was key to making sure Lab Management communicated with the VMs and allowed enrollment into the new virtual environment.

    Read the article

  • Dual boot on HP Envy Ultrabook

    - by phodu_insaan
    I just bought an HP Envy Ultrabook 1002TX. It comes with Windows 7 Home Basic and a 32GB SSD + 500GB HDD. I started to install Ubuntu, and in doing so deleted all the partitions on my HDD and recreated them the way I wanted. Then when I tried to install Ubuntu it didn't recognize my HDD. To solve this I typed dmraid -E -r /dev/sdX, where X was my SSD drive. After this Ubuntu could install, but Windows for some reason will not install. Also, the Intel caching feature is lost and the SSD is just sitting there doing nothing. I want to know how to solve this problem. Ideally I would like to use the SSD for caching, in either Windows or Ubuntu. How do I get the SSD back to working as an Intel rapid cache? How do I get Windows to install properly? It tells me that Windows is unable to configure itself for my hardware, yet my PC came with Windows pre-installed, so this should not be possible. Sorry for the long question and thanks for your answers!

    PS: At one point when I booted, I pressed Ctrl+I and went to the Intel rapid cache menu. I think I screwed something up in there, because only after this did the rapid cache stop working, and each time I booted the PC thought the BIOS was my primary disk.

    Read the article

  • Installing BURG (NO MBR!)

    - by StepTNT
    I'd like to install BURG, but I don't want it to overwrite my MBR. That's because I've got two bootloaders on my system:

        1) The default Windows 7 bootloader in my MBR
        2) GRUB in /dev/sda2

    With the first power button the machine boots as if GRUB were not installed, so it simply loads Windows; with the second button it boots from /dev/sda2, loading GRUB and then Kubuntu. Now I want to replace GRUB with BURG. I've installed BURG with burg-install /dev/sda2 --force, because I don't want my MBR to be overwritten (pressing the first power button MUST load Windows and avoid showing any sign of Linux). Setup completed without errors, and everything seems to work fine: if I open burg-manager it loads my settings, allowing me to change them and test everything with burg-emu (which works as I want it to). So everything seems fine - BURG is installed and I've changed my settings - but if I use the second power button, it still loads GRUB (even a different version from the default Kubuntu one). So here's the question: how can I replace the GRUB that's on /dev/sda2 with BURG, WITHOUT writing it to my MBR? Thanks :)

    Read the article

  • Ninject/DI: How to correctly pass initialisation data to injected type at runtime

    - by MrLane
    I have the following two classes:

        public class StoreService : IStoreService
        {
            private IEmailService _emailService;

            public StoreService(IEmailService emailService)
            {
                _emailService = emailService;
            }
        }

        public class EmailService : IEmailService
        {
        }

    Using Ninject I can set up bindings, no problem, to get it to inject a concrete implementation of IEmailService into the StoreService constructor. StoreService is actually injected into the code-behind of an ASP.NET WebForm like so:

        [Ninject.Inject]
        public IStoreService StoreService { get; set; }

    But now I need to change EmailService to accept an object that contains SMTP-related settings (pulled from the ApplicationSettings in the Web.config). So I changed EmailService to look like this:

        public class EmailService : IEmailService
        {
            private SMTPSettings _smtpSettings;

            public void SetSMTPSettings(SMTPSettings smtpSettings)
            {
                _smtpSettings = smtpSettings;
            }
        }

    Setting SMTPSettings in this way also requires it to be passed into StoreService (via another public method). This has to be done in the Page_Load method of the WebForms code-behind (I only have access to the Settings class in the UI layer). With manual/poor man's DI I could pass SMTPSettings directly into the constructor of EmailService and then inject EmailService into the StoreService constructor. With Ninject I don't have access to the instances of injected types outside of the objects they are injected into, so I have to set their data AFTER Ninject has already injected them, via a separate public setter method. This seems wrong to me. How should I really be solving this scenario?

    Read the article
