Search Results

Search found 2128 results on 86 pages for 'suzy fresh'.

Page 49/86 | < Previous Page | 45 46 47 48 49 50 51 52 53 54 55 56  | Next Page >

  • Why does Ubuntu get stuck on the loading screen?

    - by mohit
    I've been experiencing many problems lately, ever since I did a fresh install of Ubuntu 12.04 LTS on my Sony VAIO VPCEH, which previously had Windows 7 installed. Sometimes when I try to boot Ubuntu, it gets stuck at the loading screen. There seems to be some problem with a driver (as far as I can judge). The following is the log shown when I press Esc during boot (before the problem occurs):

        ...
        * Stopping System V initialization compatibility [ok]
        * Starting System V runlevel compatibility [ok]
        * Starting crash report submission daemon [ok]
        * Starting automatic crash report generation [ok]
        ... ...
        * Starting LightDM Display Manager [ok]

    Nothing works after that (no Esc, etc.) except a restart. I've also observed the following:

        Inactivity of the hard drive (the LED doesn't glow).
        Flashing or blinking of Caps Lock and Scroll Lock.
        On restart, Ubuntu seems to load successfully; however, the loading screen has somewhat basic graphics.

    This problem started after I installed, via Additional Drivers, the NVIDIA accelerated graphics driver. Most of the time Ubuntu loads without any problem, but it is annoying to have to restart every time it fails. So my question is: why does this happen, and what is the solution?
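
    A minimal diagnostic sketch for this situation, assuming the proprietary NVIDIA driver is the culprit; nvidia-current is the usual package name on 12.04 and may differ on other releases:

        # which graphics driver actually loaded on the last boot?
        grep -iE 'nvidia|nouveau' /var/log/Xorg.0.log | head

        # if the proprietary driver is implicated, removing it falls back to the open nouveau driver
        sudo apt-get purge nvidia-current
        sudo apt-get install --reinstall xserver-xorg-core
        sudo reboot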

    Read the article

  • Animating DOM elements vs refreshing a single Canvas

    - by mgibsonbr
    A few years ago, when the HTML Canvas element was still kinda fresh, I wrote a small game in a rather "unusual" way: each game element had its own canvas, and frequently animated elements even had multiple canvases, one for each animation sprite. This way, translation was done by manipulating the DOM position of the canvases, while sprite animation consisted of altering the visibility of the already drawn canvases (z-indexes, of course, were the tricky part). It worked like a charm: even in IE6 with excanvas it showed decent performance, and everything was rather consistent between browsers, including some smartphones.

    Now I'm thinking of writing a larger game engine in the same fashion, so I'm wondering whether it would be a good idea to do so in the current context (with all the advances in browsers and so on). I know I'm trading memory for time, so this needs to be customizable (even at runtime) for each machine the game will be running on. But I believe using separate canvases would also help to avoid the game "freezing" on CPU spikes, since the translation would still happen even if the redraws lag for a while. Besides, the browsers' rendering engines are already optimized in many ways, so I'm guessing this scheme would also reduce the load on the CPU (in contrast to doing everything in JavaScript - especially in the less optimized engines).

    It looks good in my head, but I'd like to hear the opinion of more experienced people before proceeding further. Is there any known drawback of doing this? I'm particularly inexperienced in dealing with the GPU, so I wonder whether this "trick" would nullify any benefit of using a single, big canvas. Or maybe on modern devices it's overkill (though I'm skeptical about the claims that canvas+JS - especially WebGL - will ever be a good alternative to native code). Any thoughts?

    Read the article

  • Ubuntu 13.10 - Black screen after login session

    - by AlexKibo88
    Yesterday, the system asked me if I wanted to upgrade to 13.10 from 13.04. Since that was the first time for me, I decided to proceed and ran the installation without worrying about it. After the system completed the upgrade, my PC was rebooted and the new version of Ubuntu signalled that the desktop would run in low graphics mode due to a graphics driver problem. I couldn't manage to fix the issue, so I continued with the low graphics mode... but nothing showed up. I wasn't even able to access one of the TTYs.

    So I ended up rebooting the system and accessing the recovery mode. I uninstalled all my ATI graphics drivers along with xorg-server and fglrx. After rebooting, I was able to reach the Ubuntu login screen, but after confirming my credentials the desktop wouldn't show up; instead a black screen appeared with the following message:

        System program problem detected

    I couldn't figure out the problem and tried to restore/re-install all the graphics components I knew of, but nothing worked. The screen still remains black and won't show the icons or the Unity bar. What would you advise me to do? Should I try a fresh install of the OS from the live CD? Thank you.
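
    A minimal sketch of the recovery sequence commonly suggested in this situation, assuming the goal is to drop back to the open-source radeon driver and let the desktop stack reinstall itself cleanly (package names are the typical 13.10 ones and may need adjusting):

        # from the recovery console or a TTY, with networking enabled
        sudo apt-get update
        sudo apt-get purge 'fglrx*'                 # clear out any leftover proprietary ATI bits
        sudo apt-get install --reinstall xserver-xorg-core xserver-xorg-video-ati ubuntu-desktop
        sudo dpkg-reconfigure lightdm               # make sure the display manager is set up
        sudo reboot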

    Read the article

  • How do I fix a garbled screen on a Gateway LT3103u?

    - by paracaudex
    I've been having garbled screen problems on a Gateway LT3103u running Ubuntu for a while. I just did a fresh install of Ubuntu 11.10 and continue to have issues. I installed xubuntu-desktop in case the issues had to do with the more sophisticated GNOME graphics. The problem is less severe, but it's still there: after a few minutes of using XFCE, the screen gets garbled. I assume this has something to do with the graphics card, but I don't know how to go about troubleshooting something like this. Where should I start?

    Update: Here is the description of the VGA card from lspci -vvv:

        01:05.0 VGA compatible controller: ATI Technologies Inc RS690M [Radeon X1200 Series] (prog-if 00 [VGA controller])
            Subsystem: Acer Incorporated [ALI] Device 028c
            Control: I/O+ Mem+ BusMaster+ SpecCycle- MemWINV- VGASnoop- ParErr- Stepping- SERR- FastB2B- DisINTx-
            Status: Cap+ 66MHz- UDF- FastB2B- ParErr- DEVSEL=fast TAbort- SERR- [disabled]
            Capabilities: [50] Power Management version 2
                Flags: PMEClk- DSI- D1+ D2+ AuxCurrent=0mA PME(D0-,D1-,D2-,D3hot-,D3cold-)
                Status: D0 NoSoftRst- PME-Enable- DSel=0 DScale=0 PME-
            Capabilities: [80] MSI: Enable- Count=1/1 Maskable- 64bit+
                Address: 0000000000000000  Data: 0000
            Kernel driver in use: radeon
            Kernel modules: radeon

    Update: Setting GRUB_CMDLINE_LINUX="nomodeset" in /etc/default/grub seems to have fixed it in both Ubuntu and xubuntu-desktop. I will test it for a day or so to see if the problems recur, and then post more detail with some links to an explanation.

    Update 2: Is it possible to use this fix for an Nvidia card (GTX 260) whose graphics became defective after an 11.10 upgrade/install? For the first few restarts the graphics were fine, then after a few restarts they suddenly became defective and stayed that way. I had to go back to 11.04 because of this problem and am waiting for 12.04, so I hope this fix helps.
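
    For reference, a short sketch of the nomodeset change described in the update above (the editor is just an example; the GRUB_CMDLINE_LINUX line on your machine may already contain other options worth keeping):

        # edit the GRUB defaults
        sudo nano /etc/default/grub

        # make the kernel command line include nomodeset, e.g.
        #   GRUB_CMDLINE_LINUX="nomodeset"

        # regenerate the GRUB configuration and reboot
        sudo update-grub
        sudo reboot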

    Read the article

  • Ubuntu 11.10 and Mobility Radeon HD 4570 (512MB): can't find working drivers

    - by Slavak
    I'm pretty new to Linux, and my latest problem is the ATI drivers. When I installed Ubuntu I had a black screen issue with a blinking cursor in the upper-left corner; I fixed it with F6 by setting the "nolapic" option, and I can only boot with "nolapic". Now the problem is the drivers. The driver suggested under "Additional drivers" is not working - the system always freezes at the login screen. The driver in question is: ATI/AMD proprietary FGLRX graphics driver.

    I tried this method: http://drivers.downloadatoz.com/tutorial/28786,how-to-fix-amd-catalyst-11-10-not-working-on-ubuntu-11-10-issues.html? but it broke things really badly, and not even the following helped anymore:

        sudo /usr/share/ati/fglrx-uninstall.sh   # (if it exists)
        sudo apt-get remove --purge fglrx*
        sudo apt-get remove --purge xserver-xorg-video-ati xserver-xorg-video-radeon
        sudo apt-get install xserver-xorg-video-ati
        sudo apt-get install --reinstall libgl1-mesa-glx libgl1-mesa-dri xserver-xorg-core
        sudo dpkg-reconfigure xserver-xorg

    Now I'm here with a fresh install and I can't find anything that works. Can someone help me please? I like Ubuntu, but I need to get rid of the lag, or it's Windows 7 only for me. Thanks for reading!
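
    A small diagnostic sketch, assuming the immediate question is whether the open-source radeon driver (rather than a leftover fglrx) is actually driving the card after the fresh install; all of these are read-only checks:

        # which kernel module is bound to the VGA device?
        lspci -k | grep -A 3 -i vga

        # is fglrx still loaded or installed anywhere?
        lsmod | grep fglrx
        dpkg -l | grep fglrx

        # any errors or warnings from the X server?
        grep -E '\(EE\)|\(WW\)' /var/log/Xorg.0.log | head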

    Read the article

  • What to do when you're the interviewer and you don't like your job?

    - by emcb
    I'm in a sorta strange predicament, and I could use some advice. When I was interviewing for my current job, the job description I was given seemed pretty darn nice to me. Without going into the details, the job hasn't quite turned out the way it was advertised. The company is great and takes care of its employees, but for someone who cares about the code they write and the work they do, it's a bad environment - effectively, we operate between 0.5 and 1.0 on the Joel test, and due to political issues we're not going to move beyond that any time soon. Bitter? Maybe. OK... so I'm in the market for a new job. But that's not where my dilemma is.

    The problem that I see coming is that I will be participating in interviewing some candidates for a position on my team, and I'm not sure what to do. I've heard through the grapevine that we have some really solid, promising, fresh-out-of-college prospects coming in to interview, and I honestly dread the thought of somebody having their first experience of engineering in this department. So I'm wondering: what should I do if/when the interviewee asks me:

        "Do you like your job?" (no)
        "What kind of projects would I be working on?" (mostly static HTML/CSS changes)
        Anything else that would elicit a negative answer if told truthfully

    Do I tell the truth, to give the candidate a real picture of the job? What if this scares them away, and what if it gets blamed on me? Do I fib or lie, saying we work on exciting projects with lots of flexibility, like the pitch my boss will give, when the reality is quite different? Should I feel any kind of moral responsibility to let a promising young developer know that this isn't the job for them, or should I shut up and be loyal 100% to the company? Any approaches or advice are appreciated. I hope I don't come across as overly dramatic - I honestly struggle with this question.

    Read the article

  • Upgrade from Linux Mint 12 to Kubuntu 12.04?

    - by MountainX
    Is there an "easy" way to "upgrade" my existing Linux Mint 12 install to Kubuntu 12.04 beta 2? I know I could reinstall. Usually I would do a clean install to avoid unexpected issues. But in this case, I don't have time to reconfigure everything from my printers to my installed software, so I am looking for the quick/easy way, while also avoiding the big risks of an upgrade gone wrong. I'm hoping to just change some repos and run a few commands from the terminal. I don't mind editing a few config files as long as I can find good HOWTOs. But I don't want to be the pioneer (arrows in the back). I'm hoping someone has done this before and has a set of steps.

    For context, I recently installed KDE 4.8 SC onto Kubuntu 11.10 using PPAs. This was on another computer. That wasn't a problem. But I decided to do a fresh install of Kubuntu 12.04 later. I like it well enough that I want to change my other computer from Linux Mint 12 to Kubuntu. (I'm going all-in with KDE. It's now my desktop of choice.) This Linux Mint upgrade will be a move from GNOME and MGSE to KDE, so that will probably complicate things a bit compared to something like upgrading Kubuntu 11.10 to KDE 4.8.

    References:
        http://www.psychocats.net/ubuntu/kde
        Is it safe to install Kubuntu-desktop in 11.10?
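
    For what it's worth, a rough sketch of how the repo-switch approach is usually attempted - this is an assumption rather than a tested recipe, and a cross-grade from Mint is genuinely risky, so back everything up first. Mint 12 is based on Ubuntu 11.10 ("oneiric"), so the idea is to point APT at the 12.04 ("precise") repositories and pull in kubuntu-desktop:

        # save the current package list and back up /home before anything else
        dpkg --get-selections > ~/package-selections.txt

        # switch the Ubuntu entries in the sources list from oneiric to precise
        # (Mint-specific repository lines should be removed or disabled by hand)
        sudo sed -i 's/oneiric/precise/g' /etc/apt/sources.list

        sudo apt-get update
        sudo apt-get dist-upgrade            # large download; expect conflicts to resolve
        sudo apt-get install kubuntu-desktop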

    Read the article

  • 12.04 wired network doesn't work (RTL8111/8168B)

    - by laket
    It's a fresh 64-bit 12.04 install. Wi-Fi works fine, but the wired connection stays off with the cable connected, and network-manager shows the cable as disconnected. Turning networking off lights up my network card's LEDs; turning networking on shuts the LEDs off, and no communication is possible.

    I have already tried turning off the network manager (sudo service network-manager stop) and setting up my eth0 manually. As soon as I switch off the network-manager, the LEDs light up, but after setting up eth0 manually (sudo ifconfig eth0 10.2.10.114 netmask 255.255.0.0 up) the LEDs turn off again. I am still dual-booting with 10.04, where I have no issues at all, leaving the cable connected all the time to my notebook and a switch.

    Here is some hardware info:

    lshw:

        *-network
             description: Ethernet interface
             product: RTL8111/8168B PCI Express Gigabit Ethernet controller
             vendor: Realtek Semiconductor Co., Ltd.
             physical id: 0
             bus info: pci@0000:03:00.0
             logical name: eth0
             version: 03
             serial: c8:0a:a9:d7:05:97
             size: 10Mbit/s
             capacity: 1Gbit/s
             width: 64 bits
             clock: 33MHz
             capabilities: pm msi pciexpress msix vpd bus_master cap_list rom ethernet physical tp mii 10bt 10bt-fd 100bt 100bt-fd 1000bt 1000bt-fd autonegotiation
             configuration: autonegotiation=on broadcast=yes driver=r8169 driverversion=2.3LK-NAPI duplex=half firmware=rtl_nic/rtl8168d-2.fw latency=0 link=no multicast=yes port=MII speed=10Mbit/s
             resources: irq:42 ioport:2000(size=256) memory:f0004000-f0004fff memory:f0000000-f0003fff memory:f0010000-f001ffff

    lspci:

        02:00.0 Network controller: Atheros Communications Inc. AR9285 Wireless Network Adapter (PCI-Express) (rev 01)
        03:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 03)

    ifconfig eth0:

        eth0      Link encap:Ethernet  HWaddr c8:0a:a9:d7:05:97
                  inet addr:10.2.10.114  Bcast:10.2.255.255  Mask:255.255.0.0
                  UP BROADCAST MULTICAST  MTU:1500  Metric:1
                  RX packets:0 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)
                  Interrupt:42 Base address:0xc000

    cat /etc/network/interfaces (already tried here with and without eth0):

        auto lo eth0
        iface lo inet loopback

    cat /etc/NetworkManager/NetworkManager.conf:

        [main]
        plugins=ifupdown,keyfile
        dns=dnsmasq

        [ifupdown]
        managed=false

    Any help is welcome ;) Laket
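
    A hedged sketch of two things worth checking with this combination: with eth0 listed in /etc/network/interfaces while NetworkManager.conf has managed=false, NetworkManager will ignore the wired interface entirely, and the r8169 driver expects the rtl_nic firmware blob shipped in the linux-firmware package. The exact file contents below are assumptions to adapt:

        # 1) let NetworkManager manage eth0 again: keep only the loopback stanza
        #    in /etc/network/interfaces, i.e.
        #      auto lo
        #      iface lo inet loopback
        sudo nano /etc/network/interfaces
        sudo service network-manager restart

        # 2) check whether the driver failed to load its firmware, and reinstall it if so
        dmesg | grep -iE 'r8169|firmware'
        sudo apt-get install --reinstall linux-firmware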

    Read the article

  • When do domain concepts become application constructs?

    - by Noren
    I recently posted a question regarding recovering a DDD architecture that had become an anemic domain model into a multitier architecture, and this question is a follow-on of sorts. My question is: when do domain concepts become application constructs? My application is a local client C# 4/WPF application with the following architecture:

        Presentation Layer
            Views
            ViewModels
        Business Layer
            ???
        Domain Layer
            Classes that take the POCOs with primitive types and create domain concepts (e.g. image, layer, etc.)
            Sanity-checks values (e.g. image width > 0)
            Interfaces for DTOs
            Interface for a repository that abstracts the filesystem
        Data Access Layer
            Classes that parse the proprietary binary files into POCOs with primitive types by explicit knowledge of the file format
            Implementation of domain DTOs
            Implementation of domain repository class
        Local Filesystem
            Proprietary binary files

    When does the MyImageType domain class with Int32 width, height, and Int32[] pixels become a System.Windows.Media.ImageDrawing? If I put it in the domain layer, it seems like implementation details are being leaked (what if I didn't want to use WPF?). If I put it in the presentation layer, it seems like it's doing too much. If I create a business layer, it seems like it would be doing too little, since there are few "rules" given the CRUD nature of the application. I think all of my reading has led to analysis paralysis, so I thought fresh eyes might lend some perspective.

    Read the article

  • Ubuntu won't suspend anymore, but it did upon install.

    - by Bruce Connor
    I did a fresh install of Ubuntu 10.10 back when it came out, and my laptop was suspending fine. All of a sudden, I can't get my laptop to suspend anymore. It's an HP Pavilion dv2-1110, but I don't think it's a hardware issue. Here's why:

        It suspended fine upon first install. I haven't installed any new kernels since then, but I have installed tons of packages, so it's probably a package.
        The suspend and hibernate options disappeared from the shutdown menu.
        If I press my keyboard's suspend button (or if I close the lid) I get the following message:
        If I try the command pmi action suspend, I get the error message: Error org.freedesktop.DBus.Error.ServiceUnknown: The name org.freedesktop.Hal was not provided by any .service files.
        If I try the command echo -n mem > sudo /sys/power/state I get absolutely no output and no visible effect.

    What might be causing this behavior? I thought a list of installed packages might be useful, but it's huge and I don't know how to post it here in collapse/expand mode or something.

    EDIT: Just in case someone asks, none of the installed packages are kdm or anything like that (which would justify the lack of options in GNOME's shutdown menu).
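
    A small sketch of what one might try instead, on the assumption that the pmi failure is simply because pmi talks to the long-removed HAL daemon rather than anything hardware-related (paths and package names are the stock 10.10 ones):

        # pm-utils suspends via the kernel directly and does not need HAL
        sudo pm-suspend

        # the low-level equivalent of the echo attempt above, with sudo applied correctly
        echo mem | sudo tee /sys/power/state

        # if suspend still fails, the pm-utils log usually says why
        tail -n 50 /var/log/pm-suspend.log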

    Read the article

  • How can I downgrade a system that accidentally had backports installed?

    - by Glyph
    I installed a fresh Ubuntu system. Somehow - possibly through my own error - the backports repository got enabled. Then I did several upgrades. I noticed this had happened when networking suddenly stopped working: "Network Settings" now has "(alpha)" in the title bar, and "System Settings" -> "Network" now displays an error dialog saying "The system network services are not compatible with this version". Now I've disabled the backports repository, and I'd like to restore my system to its previously functional state. My question is twofold:

        1. How do I determine which packages were installed from backports?
        2. Can I automatically re-install all those packages (and purge their configuration) to get back to a sensible state?

    If the answer to 2 is "no", I can probably manually purge some things and reinstall, but it would be nice to have it handled automatically.

    Update: It wasn't an update that broke the network; it was apt-get install indicator-network, which installed something called "connman" and removed network-manager and network-manager-gnome. Nevertheless I am leaving the question up, since I am still interested in how I can purge packages from a particular source after accidentally adding that source, and how I can determine which packages were installed from where.
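
    A hedged sketch of the usual way to answer part 1 and force the downgrade in part 2. The "precise" pocket name is a placeholder for whatever release is installed, and aptitude may need to be installed first; the pkg/release syntax tells apt-get to take the version from that archive even if it is older than the installed one:

        # 1) list installed packages whose installed version came from the backports pocket
        sudo apt-get install aptitude
        aptitude search '?narrow(?installed, ?archive(precise-backports))'

        # or, for a single package, show which archive each version comes from
        apt-cache policy network-manager

        # 2) force packages back to the version in the main archive
        sudo apt-get install network-manager/precise network-manager-gnome/precise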

    Read the article

  • An adequate message authentication code for REST

    - by Andras Zoltan
    My REST service currently uses SCRAM authentication to issue tokens for callers and users. We have the ability to revoke caller privileges and ban IPs, as well as impose quotas on any type of request. One thing that I haven't implemented, however, is a MAC for requests. As I've thought about it more, I think this is needed for some requests, because otherwise tokens can be stolen, and before we identify this and deactivate the associated caller account, some damage could be done to our user accounts. In many systems the MAC is generated from the body or query string of the request; however, this is difficult to implement as I'm using the ASP.NET Web API and don't want to read the body twice. Equally importantly, I want to keep it simple for callers to access the service. So what I'm thinking is to have a MAC calculated over:

        the URL, possibly minus the query string
        the verb
        the request IP (potentially a barrier on some mobile devices, though)
        the UTC date and time when the client issues the request

    For the last one I would have the client send that string in a request header, of course - and I can use it to decide whether the request is 'fresh' enough. My thinking is that whilst this doesn't prevent message body tampering, it does prevent a malicious third party from using a captured request as a template for different requests later on. I believe only the most aggressive man-in-the-middle attack would be able to subvert this, and I don't think our services offer any information or ability that is valuable enough to warrant that. The services will use SSL as well, for sensitive stuff. And if I do this, then I'll be using HMAC-SHA-256 and issuing private keys for the HMAC appropriately. Does this sound enough? Have I missed anything? I don't think I'm a beginner when it comes to security, but when working on it I am always shrouded in doubt, so I appreciate having this community to call upon!
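
    A minimal sketch of the signing step being described, shown with openssl on the command line purely to illustrate the construction; the field order, separator, and header names are assumptions - the client and the ASP.NET Web API handler just have to agree on them and on the shared key:

        # inputs both sides know
        URL="/api/orders/123"                        # example path, query string excluded
        VERB="POST"
        CLIENT_IP="203.0.113.7"
        TIMESTAMP="$(date -u +%Y-%m-%dT%H:%M:%SZ)"   # also sent as a header, e.g. X-Request-Timestamp
        SECRET="caller-private-key"                  # issued out of band, never sent on the wire

        # canonical string: newline-separated fields, keyed with HMAC-SHA-256
        MAC=$(printf '%s\n%s\n%s\n%s' "$URL" "$VERB" "$CLIENT_IP" "$TIMESTAMP" \
              | openssl dgst -sha256 -hmac "$SECRET" | awk '{print $NF}')

        echo "X-Request-MAC: $MAC"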

    Read the article

  • Failed to fetch *.deb Size mismatch, then packages with unmet dependencies [solved]

    - by user113907
    I recently bought the wonderfully looking and reviewed Amnesia: The Dark Descent and I'm trying to install it. The first time I tried to download it, I had to stop in the middle of the download (which may have broken something). The second time I tried, at the end of the download it gave me the following error:

        Failed to fetch https://private-ppa.launchpad.net/commercial-ppa-uploaders/amnesia/ubuntu/pool/main/a/amnesia/amnesia_1.2.1-0ubuntu2_i386.deb Size mismatch

    Now, whenever I try to download it, it gives me this error:

        The following packages have unmet dependencies:
        amnesia: Depends: libalut0 (>= 1.0.1) but it is not going to be installed
                 Depends: libc6 (>= 2.4) but 2.15-0ubuntu10.3 is to be installed
                 Depends: libfontconfig1 (>= 2.8.0) but 2.8.0-3ubuntu9.1 is to be installed
                 Depends: libfreetype6 (>= 2.2.1) but 2.4.8-1ubuntu2 is to be installed
                 Depends: libgcc1 (>= 1:4.1.1) but 1:4.6.3-1ubuntu5 is to be installed
                 Depends: libopenal1 (>= 1:1.13) but 1:1.13-4ubuntu3 is to be installed
                 Depends: libsdl1.2debian (>= 1.2.10-1) but 1.2.14-6.4ubuntu3 is to be installed
                 Depends: libstdc++6 (>= 4.1.1) but 4.6.3-1ubuntu5 is to be installed
                 Depends: libxft2 (> 2.1.1) but 2.2.0-3ubuntu2 is to be installed
                 Depends: zlib1g (>= 1:1.1.4) but 1:1.2.3.4.dfsg-3ubuntu4 is to be installed

    I have already searched the net and run a few command-line commands, e.g.:

        sudo dpkg --configure -a
        sudo apt-get install -f

    I also configured the software packages to download from Main instead of the local UK server, but I'm really not finding a solution. I have a fresh install of the latest LTS (12.04). The only non-standard thing so far is that I installed gnome-shell (?) because I really can't stand Unity. Help would be much appreciated. I am currently more than entertained enough with World of Goo and Command & Conquer, but I will want to play Amnesia in the near future.
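
    A short sketch of the usual way to clear a corrupted partial download so apt fetches the .deb again; the final package name is taken from the error above:

        # throw away partially downloaded or corrupted archives
        sudo apt-get clean

        # refresh the package lists, then repair anything half-configured
        sudo apt-get update
        sudo dpkg --configure -a
        sudo apt-get install -f

        # retry the install
        sudo apt-get install amnesia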

    Read the article

  • Oracle at Information Security and Risk Management Conference (ISACA Conferences)

    - by Tanu Sood
    The North America Information Security and Risk Management (ISRM) Conference hosted by ISACA will be held this year from November 14-16 in Las Vegas, Nevada, and Oracle is a platinum sponsor. The ISRM / IT GRC event is not only designed to meet the exact needs of information security, governance, compliance and risk management professionals like you, but also gives you the tools you need to solve the issues you currently face. The event builds on and includes the key elements of information security, governance, compliance and risk management practices, and offers a fresh perspective on current and future trends. As a Platinum Sponsor, Oracle will have an opportunity to demonstrate and talk through our strategic roadmap and support, to ensure all organizations understand our key role within the industry in keeping corporate data and information safe.

    Join us at the Lunch and Learn to learn more about the latest advances in Oracle Identity Management.

    Lunch and Learn Session: Trends in Identity Management
    Speaker: Mike Neuenschwander, Senior Product Development Director, Oracle Identity Management

    As enterprises embrace mobile and social applications, security and audit have moved into the foreground. The way we work and connect with our customers is changing dramatically, and this means re-thinking how we secure the interaction and enable the experience. Work is an activity, not a place - mobile access enables employees to work from any device anywhere and anytime. Organizations are utilizing "flash teams" - instead of a dedicated group to solve problems, organizations utilize more cross-functional teams. Work is now social - email collaboration will be replaced by dynamic social-media-style interaction. In this session, we will examine these three secular trends and discuss how organizations can secure the work experience and adapt audit controls to address the "new work order".

    We also recommend you bookmark the following session:
    T1 Session 301: Gone in 60 Seconds: Mitigating Database Security Risk
    Friday, November 16, 8:30 am - 9:30 am

    And do be sure to stop by our booths, #100 and #102, to network with our Product Development Team and get an onsite demonstration of Oracle Security Solutions. See you there?

    ISRM / IT GRC
    November 14-16, 2012
    Mirage Casino-Hotel
    3400 Las Vegas Boulevard South
    Las Vegas, NV 89109

    Read the article

  • Problems with Maverick upgrade

    - by altenuta
    I upgraded to Maverick 10.10 from Lucid. I have an old Toshiba Satellite with a 1.1 GHz processor and 256MB of RAM. Initially I couldn't get my wireless to work; that solved itself after installing various updates and programs. The problems that remain are:

        I have to authenticate at least twice at start-up. This machine is Ubuntu only.
        There is no boot loading screen.
        I have a ton of programs and system directories in my home folder. Is this normal?
        It is difficult to wake the computer from sleep. Usually I just shut it down and restart. Tonight I waited and got a message about corrupt memory.
        The computer takes forever to do just about everything - starting programs, doing things on the web, and so on.

    I am a longtime Mac user (since 1986). I also manage a network of several windoze machines. I am definitely a GUI guy and do very little in the terminal, so I really need to know where to begin to get things straightened out. Can I rescue this machine without wiping it and doing a fresh install? This is basically a hobby machine. Aside from all the programs and upgrades I've installed, I have almost no files or documents to worry about saving. Anyone have any ideas about the problems I'm having and the best way to proceed?

    Thanks, Al

    Read the article

  • Is there such a thing as "closure" with software work?

    - by Bobby Tables
    I burned out last year (after a decade of full-time programming jobs) and am on a sabbatical now. With all the self-examination, I've started to figure out some of the root causes of my burnout, and one of the major ones is basically this: there was never any real closure in any of the work I've ever done. It was always a case of getting into an open-ended support/maintenance grind and going stale.

    When I first entered the industry, I had this image of programming work being very project-based. And I expected projects to have a beginning, a middle, and an END. And then you move on and start on something totally new and fresh. Basically, I never expected that a lot (most) of software work involves supporting and maintaining the same code base for open-ended, long periods of time - years and even decades. That, combined with generally having itchy feet, makes me think that burnout is inevitable for me, after 2-3 years, in ANY full-time software job. All this sounds like I probably should have been a contractor instead of a full-timer. But when I discuss this with people, a lot of them say that even THEN you can't really escape having to go back and maintain/support the stuff you worked on, over and over (coming back on support contracts, for example). The nature of software work is simply like that. There is no project closure, unlike in many other engineering fields.

    So my question is: is there ANY programming work out there which is based on short- to mid-term projects/stints and then moving on cleanly? And is there any particular industry domain or specialization where this kind of project work is typical?

    Read the article

  • NoMachine NX: text missing in all GTK interfaces (Unity and GNOME Classic)

    - by hansioux
    [Edit] I later realized my issue only occurs when I am using NX to remotely access my machine, so I edited the title and description. I have also found a temporary solution, which is to "disable render extension" in the custom display settings. But doing so makes the NX experience very slow and laggy, and not that nice to look at. [/Edit]

    I did a fresh install on a new computer and was trying to set up my fonts. When I log in remotely via NX, the text is missing on all GTK-based interfaces. That means most menus (except for Unity), right-click menus, applications themselves, the terminal, and so on. About the only thing unaffected is Firefox: all text shows up just fine in Firefox. So that probably already says something about font permissions.

    I went to check whether my fonts have the correct permissions, and they do. I removed my custom settings from /etc/fonts/conf.d, and still the text is missing. There is a workaround: using "disable render extension" in the custom display settings. How do I fix this issue permanently?
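
    Not a known fix, just a cheap thing to try before digging deeper, on the assumption that the machine's font setup (rather than NX itself) is at fault: rebuild the font caches, re-run fontconfig's configuration, and retest with the render extension enabled:

        # rebuild the per-user and system font caches
        fc-cache -f -v

        # re-run fontconfig's configuration step
        sudo dpkg-reconfigure fontconfig

        # sanity check that fontconfig can resolve a default font
        fc-match sans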

    Read the article

  • What You Said: How You Find New Books

    - by Jason Fitzpatrick
    Earlier this week we asked you to share your tips and tricks for finding fresh books to enjoy. Now we're back with tips ranging from the old school to the digital.

    SJ highlights several of the most popular web-based tools for finding new books: "Goodreads.com is quick and easy. Yournextread.com is fun and helps a lot. But I gotta be honest, Amazon's suggestions are probably the most useful to me."

    TheFu suggests checking out award-winning lists and one rather quirky way to pick a good sci-fi book: "For scifi, see Hugo winning books. Life is too short to read bad books. Sometimes that leads to an author with an entire series of books to enjoy. I really enjoy some of the scifi from the 40s and 50s. Wells' stuff is always timeless too (and free). I'm less happy with Nebula winners - different type of writers and not my personal taste."

    Read the article

  • Acer Aspire One 725 - missing graphics card driver for Radeon HD 7290?

    - by Melon
    Recently I bought an Acer Aspire One 725 netbook and installed Ubuntu 12.10 on it. I bought it because it can play HD movies and has Full HD output on the external VGA port. However, movies from YouTube have a really low frame rate, and if you open three tabs in Opera (for example Gmail, YouTube and Ask Ubuntu) it gets really laggy. My suspicion is that the graphics card driver is missing. When I check System->Details->Graphics, the driver is unknown. After running lspci | grep VGA I get this output:

        00:01.0 VGA compatible controller: Advanced Micro Devices [AMD] nee ATI Device 980a

    From what I see, I have an AMD C70 processor with an integrated AMD Radeon HD 7290. Has anyone had the same problem? Do you know which drivers need to be installed for the graphics to work properly? On the official Acer page there are only drivers for Windows 7 and Windows 8...

    Update: OK, another attempt. I have a fresh Ubuntu 12.10 with all updates done. I downloaded the Catalyst 12.11 beta drivers and decided to create a package. After installing the package, I get this error in /var/log/Xorg.0.log:

        [ 13.394] (**) fglrx(0): NoAccel = NO
        [ 13.394] (**) fglrx(0): AMD 2D Acceleration Architecture enabled
        [ 13.394] (--) fglrx(0): Chipset: "AMD Radeon HD 7290 Graphics" (Chipset = 0x980a)
        [ 13.394] (--) fglrx(0): (PciSubVendor = 0x1025, PciSubDevice = 0x0740)
        [ 13.394] (==) fglrx(0): board vendor info: third party graphics adapter - NOT original AMD
        [ 13.394] (--) fglrx(0): Linear framebuffer (phys) at 0xe0000000
        [ 13.394] (--) fglrx(0): MMIO registers at 0xf0200000
        [ 13.394] (--) fglrx(0): I/O port at 0x00003000
        [ 13.394] (==) fglrx(0): ROM-BIOS at 0x000c0000
        [ 13.484] (II) fglrx(0): ATIF platform detected
        [ 13.564] (II) fglrx(0): AC Adapter is used
        [ 13.565] (EE) fglrx(0): V_BIOS address 0xd00 out of range
        [ 13.565] (EE) fglrx(0): Failed to obtain VBIOS from Kernel!
        [ 13.565] (EE) fglrx(0): VBIOS read from Kernel, Invalid signature!
        [ 13.565] (EE) fglrx(0): GetBIOSParameter failed
        [ 13.565] (EE) fglrx(0): PreInitAdapter failed
        [ 13.565] (EE) fglrx(0): PreInit failed
        [ 13.565] (II) fglrx(0): === [xdl_xs113_atiddxPreInit] === end
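
    A hedged suggestion in sketch form, assuming the hand-built Catalyst 12.11 beta packages are the problem rather than the hardware: clean out the beta driver, then either try the fglrx build packaged in the 12.10 repositories or fall back to the open-source radeon driver:

        # remove the locally built Catalyst beta packages
        sudo apt-get purge 'fglrx*'
        sudo apt-get update

        # option A: the packaged proprietary driver
        sudo apt-get install fglrx fglrx-amdcccle
        sudo aticonfig --initial        # writes a fresh /etc/X11/xorg.conf (only if fglrx installed OK)

        # option B: stay on the open radeon driver and just make sure Mesa/X are intact
        sudo apt-get install --reinstall xserver-xorg-core libgl1-mesa-glx libgl1-mesa-dri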

    Read the article

  • Introducing a (new) test method to a team

    - by Jon List
    A couple of months ago I was hired into a new job (I'm fresh out of my Masters in software engineering). The company mainly consists of ERP consultants, but I was hired into their fairly small web department (6 developers); our main task is ERP/e-commerce integration (ERP-integrated web shops). The department is growing, and recently my manager asked me to start thinking about introducing tests to the team. I love a challenge, but frankly I'm a bit scared (I'm the least experienced member of the team).

    Currently the method of testing is clicking around in the web shop and asking the customer if the products are there, if they look okay, and if orders are posted correctly to the ERP. We are getting a lot of support cases on previous projects, where a customer or a customer's customer has run into errors, which - I suppose - is why my manager wants more structured testing.

    Off the top of my head, I thought of some (obvious?) improvements, like looking at the requirement specification, having an issue tracker, enabling team members to register their time on a "tests" line in the budget, and circulating tasks amongst members of the team. But as I see it, we have three main challenges:

        1. General website testing (JavaScript, C#, ASP.NET and CMS integration tests).
        2. (Live) ERP integration testing (customers rarely want to pay for test environments).
        3. Adopting a method in the team.

    I like the responsibility, but I am afraid that I'm a little bit over my head. I expect that my manager expects me to set up some kind of workshop for the team where I present some techniques and ideas and where we (the team) can find some solutions together. What I learned in school was mostly unit testing and program verification, not so much testing across multiple systems and applications. What I'm looking for here is references/advice/pointers/anecdotes - anything that might help me to get smarter and to improve the current method of my team. Thanks!!

    (TL;DR: read the bold parts)

    Read the article

  • Struggling to connect to the network when using WPA with a BCM43225

    - by pst007x
    When booting my laptop, it tries to connect to my wireless network, but a window keeps popping up asking me for my security password, which has already been saved. I have to keep deleting my network settings and reconnecting, otherwise it keeps failing to connect. My wireless is set up with WPA; I do not want to lower my security because of this, but it is a pain and can take me 15 minutes or more to finally connect. The problem has only become apparent since a fresh install of 11.10. IPv6 is disabled.

    System info:

        01:00.0 Ethernet controller: Broadcom Corporation NetLink BCM57780 Gigabit Ethernet PCIe (rev 01)
            Subsystem: Acer Incorporated [ALI] Device 036d
            Flags: bus master, fast devsel, latency 0, IRQ 43
            Memory at b3400000 (64-bit, non-prefetchable) [size=64K]
            Capabilities: <access denied>
            Kernel driver in use: tg3
            Kernel modules: tg3

        02:00.0 Network controller: Broadcom Corporation BCM43225 802.11b/g/n (rev 01)
            Subsystem: Broadcom Corporation Device 04da
            Flags: bus master, fast devsel, latency 0, IRQ 17
            Memory at b2400000 (64-bit, non-prefetchable) [size=16K]
            Capabilities: <access denied>
            Kernel driver in use: brcmsmac
            Kernel modules: wl, brcmsmac

    ADDITIONAL: In the terminal I get this:

        pst007x@pst007x-ubuntu64:~$ nm-applet start
        ** Message: applet now removed from the notification area
        ** (nm-applet:2816): DEBUG: old state indicates that this was not a disconnect 0
        ** Message: using fallback from indicator to GtkStatusIcon
        ** Message: applet now embedded in the notification area
        ** Message: No keyring secrets found for Auto Access 01/802-11-wireless-security; asking user.
        ** (nm-applet:2816): DEBUG: foo_client_state_changed_cb

    Note this line:

        ** Message: No keyring secrets found for Auto Access 01/802-11-wireless-security; asking user.

    This is the point at which I am asked for the password. Please report WPA issues with Ubuntu 11.10 here: https://bugs.launchpad.net/ubuntu/+source/network-manager/+bug/892727
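
    One thing often suggested for the "No keyring secrets found" symptom, sketched below on the assumption that storing the PSK system-wide (instead of in the per-user GNOME keyring) is acceptable: mark the connection as available to all users, then confirm that the secret has landed in NetworkManager's own store so it no longer has to prompt:

        # open the connection editor, select the wireless connection, and tick
        # "Available to all users" before saving (it will ask for the password once)
        nm-connection-editor

        # afterwards the PSK should be stored, root-readable only, here:
        sudo ls -l /etc/NetworkManager/system-connections/
        sudo grep -i psk /etc/NetworkManager/system-connections/*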

    Read the article

  • Ubuntu 11.10 logs off when clicking shutdown

    - by Rourke
    As the title says: when I click Shutdown from the menu, it logs off. When I click Shutdown from the login menu, it does nothing. I'm using a fresh install of Ubuntu 11.10. I can force it to shut down with the command below, but I don't want to keep typing that whenever I want to shut down my laptop:

        sudo shutdown -h now

    So it's probably processes which aren't closing. I'm a novice Linux user, so I have no idea how to rule out the software causing this. I think it's either Gwibber/Empathy, or perhaps Mozilla Thunderbird, because this started happening since I began using them. So a few questions:

        1. How do I rule out what software is causing this?
        2. How do I stop it from not closing on shutdown?
        3. If 1 and 2 don't work, is it possible to add the above command to the shutdown process?

    Edit: Rourke here. Somehow I cannot accept the comment below from mech-e as the solution. Thank you, this was indeed the answer I was looking for!

    Read the article

  • Touchpad stopped working on an Acer AspireOne D255E

    - by Gustavo
    I have an Acer Aspire One D255E, less than a year old, with a fresh install of Ubuntu 11.10. Everything has been working fine - great OS and great netbook - but today the touchpad stopped working. It had been working fine; I closed the netbook, and when I opened it again a couple of hours later I could not move the pointer with the touchpad. I cannot get the pointer to move at all. I cleaned the touchpad surface well, just in case. Everything else is working fine, all the software updates are up to date, and I have rebooted several times with no solution. I have attached a USB mouse and the pointer works well with it.

    What can I do to troubleshoot the problem? I have gone to System Settings, touchpad section, and there is not much that I can do there. I would like to determine first whether it is a hardware or a software issue, and then how to resolve it. Is there a way I can reinstall the touchpad drivers, just in case it is a software problem? I have been using Ubuntu for nearly a year now and am very happy with it. Are there any wise Ubuntu gurus out there who can help me? Thank you for reading this note.
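
    A small diagnostic sketch for the hardware-versus-software question, assuming a Synaptics-style touchpad; all of these are read-only checks:

        # does the kernel still see the touchpad at all?
        grep -i -A 4 -E 'touchpad|synaptics' /proc/bus/input/devices

        # does X list it as an input device?
        xinput list

        # has it simply been switched off in the driver? (TouchpadOff = 0 means enabled)
        synclient -l | grep -i touchpadoff

        # anything interesting in the X log?
        grep -iE 'synaptics|touchpad' /var/log/Xorg.0.log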

    Read the article

  • Unable to remove the lock by normal means

    - by Loki
    I've been installing ubuntu-restricted-extras via the Software Center. Everything was going well at first, but then the installation process froze at the 'applying changes' stage. I've had this happen in the past, and usually just hitting the 'cancel' button helped, but not this time. Obviously the install process has placed a lock, and I couldn't issue any apt-get commands.

    Then I tried doing what was suggested in "Fixing Could not get lock /var/lib/dpkg/lock":

        sudo fuser -cuk /var/lib/dpkg/lock; sudo rm -f /var/lib/dpkg/lock

    but it seemed to me that it only killed my X server. Okay, so I just pressed the power button on my PC and restarted, hoping that the lock was finally gone and I could reinstall the stuff. No dice. When I open the Software Center, I still have one operation in progress, a weird one: "Searching | Cancelling". The 'cancel' button is either inactive, or it just does nothing. So I've become desperate and decided to write here. How do I fix the problem? I can't install anything on a fresh Ubuntu 12.04 :) Thanks in advance
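
    A sketch of the usual way to clear a stale dpkg/apt lock and let the half-finished install complete, assuming no package manager is actually running any more (check first - removing the lock under a live apt process can corrupt the package database):

        # make sure nothing package-related is still running
        ps aux | grep -E 'apt|dpkg|software-center' | grep -v grep

        # only if the list above is empty, remove the stale locks
        sudo rm -f /var/lib/dpkg/lock /var/cache/apt/archives/lock

        # let dpkg finish whatever was interrupted, then retry
        sudo dpkg --configure -a
        sudo apt-get install -f
        sudo apt-get install ubuntu-restricted-extras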

    Read the article

  • Setting up Cluster Configuration using an existing web server as a Primary Node?

    - by RapidWebs
    Thanks in advance for any help! I am having a slight issue and need help with the decision-making process when it comes to setting up my cluster configuration, consisting of a line of Ubuntu Servers (12.04). We currently have a primary node, which resides in a datacenter in the US, but we are going to be using that for all the seriously bandwidth- and resource-intensive websites, and - through a configuration of Virtualmin + Webmin - it will be set up as a sort of pseudo-cluster, using Virtualmin's cluster modules.

    Anyway, on to the issue. We also have a business line set up locally, with three servers. Here are their specs:

        Intel P4 2.4 GHz, 1GB RAM, 110 GB SATA, Ubuntu 12.04 *
        AMD 1.3 GHz, 512MB RAM, 20 GB IDE
        P3 Xeon 800 MHz (dual physical processors), 1GB RAM, 3 x 25 GB RAID configuration (one disk in use for the host operating system)

    * The first machine is currently IN USE and is serving virtual hosts off a sub-domain.

    My question is this: how can I integrate the secondary node (which will be the primary node, per se, in this smaller configuration), which is currently in use, into a cluster configuration with the other two servers for:

        Sharing resources
        Redundancy (HA?)
        NFS with the two RAID disks

    without having to FORMAT the secondary node, start fresh, move all my services onto a DRBD network drive or something similar, and then restore all the active Virtualmin virtual hosts? The idea is that I want minimal downtime for people currently being served from server2.mywebsite.com, and from what I understand, all services need to be on an NFS share so that they can be mounted on demand and accessed by the other machine taking over (i.e. a Heartbeat + DRBD configuration). But my issue is that I already have all these services installed in their default directory structure: how can I most easily set up this NFS and HA system, move all my desired services to this new drive, and do it with minimal downtime, without breaking Virtualmin and everything else on my server?

    Even just some pointers, a thread I could read, or a step-by-step checklist or rundown of commands I could issue to get started would be great! Thanks!
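
    As a starting point only, a minimal sketch of exporting an existing directory over NFS from the current primary without reformatting anything; the path and the 192.168.1.0/24 subnet are placeholders, and a proper Heartbeat/DRBD HA setup would come later, but this lets the other nodes mount the data where it already lives:

        # on the existing primary (server2)
        sudo apt-get install nfs-kernel-server

        # export the directory that holds the virtual hosts (example path)
        echo '/home/virtualhosts 192.168.1.0/24(rw,sync,no_subtree_check)' | sudo tee -a /etc/exports
        sudo exportfs -ra

        # on a secondary node
        sudo apt-get install nfs-common
        sudo mkdir -p /mnt/virtualhosts
        sudo mount -t nfs server2.mywebsite.com:/home/virtualhosts /mnt/virtualhosts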

    Read the article
