Search Results

Search found 1253 results on 51 pages for 'lucifer sam'.

Page 8/51 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • Bluetooth fix for Ubuntu 12.04, Lenovo G580

    - by Sam Abraham
    Bluetooth is not working: it shows as turned on, but the manager indicates Bluetooth is disabled. I uninstalled the default manager and installed Blueman. Same with Blueman: clicking on "connect to devices" gets the response "adapters not found". I've found many more people with the same problem, but the fixes found in the archive don't work for me. I've tried a couple of things from the forum. I'm not familiar with computer hardware or software, but I have been using Ubuntu because it saves me money, it's fairly easy to use, and it does not tax my mid-range laptop. Any help will be appreciated.

    Read the article

  • How to convince boss to buy Visual Studio 2012 Professional

    - by Sam Leach
    The main advantage is the use of ReSharper and other add-ons, but we need to make a convincing argument for the purchase of Visual Studio 2012 Professional. We are currently using Visual Studio 2012 Express for Windows. It is quite good, but it is hard having used the full Professional version in the past. So far the team has compiled the following list:

    - Extract Interface function missing. Very useful for clean SOLID code.
    - No add-on support. Can't install StyleCop or productivity tools: AnkhSVN, spell checker, Productivity Power Tools, GhostDoc, Regex Editor, PowerCommands.
    - The exception assistant is limited in the Express edition. This is a big annoyance. See http://www.lifehacker.com.au/2013/01/ive-given-up-on-visual-studio-express-2012-for-windows-desktop-heres-why/
    - Different tools provided by MS, like certificate generation.
    - The ability to create a test project based on source code.

    We do server development in C#, so web add-ons and the like are of no use to us. The reason I am asking is that I am sure people have been in the same position. What approach did you use, and can you think of additions or amendments to the above list? Thanks,

    Read the article

  • SEO Pros and cons of having your blog in a subdirectory or subdomain

    - by sam
    From an SEO point of view, is it better to have your blog running as part of your site (i.e. /blog) so that it generates more content for the site, OR is it better to run it as a subdomain (i.e. blog.) of your main site (correct me if I'm wrong, but Google sees subdomains as separate sites?) so that the main site gets lots of external links from my blog? But then again, it would be generating no extra content for my main site.

    Read the article

  • Propagating Apache rules Automatically for all folders Beneath Root

    - by Sam
    Hi folks, my webroot folder /httpdocs contains a .htaccess file. The first lines look like this:

        RewriteEngine on
        RewriteBase /
        AddDefaultCharset UTF-8
        Options +FollowSymLinks -Indexes -ExecCGI
        # DirectoryIndex index.php /index.php
        # ServerSignature Off

    Now I want all the settings it contains to be propagated automatically to the other folders as well. How can I do that? Thanks very much for any suggestions.

    Read the article

  • What would be a best way to code a rudimentary LIMS?

    - by Sam
    Hi, I hope that I am asking the right question in the right place. If not, please do not hesitate to point me to the right place. I am working in a laboratory, where computers/programming/programmers are all used only in a support role, but I believe that we need better information organization and management. Given the diverse information generated in the lab, I believe we need some sort of Laboratory Information Management System (LIMS). Is there a good starting point for this? I am not a trained programmer myself. I can code reasonably well in Python and R, and if necessary I can learn MySQL and PHP.
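
    Since the asker mentions being able to learn MySQL and PHP, here is a minimal sketch of a starting point in PHP: a single sample-tracking table in an embedded SQLite database, so no server setup is needed. The schema and field names are illustrative assumptions, not a LIMS standard.

        <?php
        // Minimal LIMS prototype: one "samples" table in a local SQLite file.
        // Schema and field names are illustrative assumptions.
        $db = new PDO('sqlite:lims.db');
        $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

        $db->exec('CREATE TABLE IF NOT EXISTS samples (
            id          INTEGER PRIMARY KEY AUTOINCREMENT,
            label       TEXT NOT NULL,   -- human-readable sample name
            received_at TEXT NOT NULL,   -- ISO-8601 date
            experiment  TEXT,            -- which experiment it belongs to
            notes       TEXT
        )');

        // Record a new sample.
        $stmt = $db->prepare('INSERT INTO samples (label, received_at, experiment, notes)
                              VALUES (?, ?, ?, ?)');
        $stmt->execute(['S-0001', date('Y-m-d'), 'protein-assay', 'first test batch']);

        // List everything received today (SQLite date("now") is the UTC date).
        foreach ($db->query("SELECT id, label FROM samples
                             WHERE received_at = date('now')") as $row) {
            echo $row['id'], ' ', $row['label'], PHP_EOL;
        }

    Growing this into a real LIMS is then mostly a matter of adding tables (instruments, runs, results) and a thin web front end; the same schema could equally be driven from Python.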

    Read the article

  • New META TAGS with positive effects for SEO ranking in 2011 and beyond

    - by Sam
    Hi all, I'm trying to make an up-to-date chart of meta tags, for all of us, with their purposes, their use and their good (or bad) effects on search engines/being found. Also, does anybody know of new/promising meta tags? I will add yours to my list, so this chart is the result of live discussion and stays up to date. Also, it would be creative to invent your own useful meta tag, because we are the ones making the web, or aren't we?

    LEGEND
    P  PURPOSE?    what does this meta tag do in 2011, if anything
    N  NECESSARY?  does every site really need it or not?
    G  GOOD        whether it will have a good effect on your site being found
    I  INVENTED    invented meta tag; who knows, it might be accepted in a year!

    META "METANAME" = PURPOSE? - NECESSARY? - GOOD EFFECT?

    #### important
    meta "title"       = P concise summary + teaser - N very - G extremely
    meta "description" = P description + teaser - N yes - G very
    meta "robots"      = P if needed, to skip default DMOZ/Yahoo Directory listing - N no - G ?

    #### new & promising! Thanks for input (John, )
    meta "original-source"    = P url of whoever broke the news gets the credit - N ? - G ?
    meta "syndication-source" = P url for syndication of published news - N ? - G ?
    meta "canonical"          = P ? - N ? - G ?

    #### seems obsolete
    meta "keywords"   = P some keywords - N+G not for Google, but Yahoo likes them
    meta "language"   = P overrule guesswork by defining the language - N no - G ?
    meta "page-topic" = P topic/theme - N ? - G ?
    meta "abstract"   = P short summary - N ? - G ?
    meta "copyright"  = ?

    #### invented by me
    meta "audience" = P filtered audience: "+seniors, +parents, -children, -youth"
    meta "mood"     = P specifies textual style: "discussion, informative, commercial, sexual, fictional, scientific, romantic, therapeutic, technical"

    Read the article

  • getting 500 internal error when setting 301 redirect using .htaccess

    - by sam
    I'm trying to use a 301 redirect to direct users and bots to my new site, but when I put the .htaccess live I keep getting a 500 internal error. The site is actually a subdomain which I want to redirect to another subdomain on another site (I'm not sure if that's relevant, but I thought I should include it). The site is hosted on an Apache server. The 301 .htaccess code I'm using is:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteRule (.*) http://www.blog.mysite.co.uk/$1 [R=301,L]

    Any idea what might be wrong with this?

    Read the article

  • Upgrading from MVC 1.0 to MVC2 in Visual Studio 2010 and VS2008.

    - by Sam Abraham
    With MVC2 officially released, I was involved in a few conversations regarding the feasibility of upgrading existing MVC 1.0 projects to quickly leverage the newly introduced MVC features. Luckily, Microsoft has proactively addressed this question for both Visual Studio 2008 and 2010, and many online resources discussing the upgrade process are a "Bing/Google search" away. As I will be speaking about MVC2 and Visual Studio 2010 at the Ft. Lauderdale ArcSig .Net User Group meeting on April 20th, 2010 (check http://www.fladotnet.com for more info), I decided to include a quick demo on upgrading the NerdDinner project (which I consider the "Hello MVC World" project) from MVC 1.0 to MVC2 using Visual Studio 2010, to demonstrate how simple the upgrade process is. In the next few lines, I will briefly touch on upgrading to MVC2 in Visual Studio 2008, then discuss the upgrade process using Visual Studio 2010 in more detail while highlighting the advantage of its multi-targeting support.

    Using Visual Studio 2008 SP1

    For upgrading to MVC2 using VS2008 SP1, a Microsoft white paper [1] presents two approaches: 1) using a provided automated upgrade tool, or 2) manually upgrading the project. I personally prefer using the automated tool, although it comes with an "AS IS" disclaimer. For those brave souls, or those who end up with no luck using the tool, detailed manual upgrade steps are also provided as a second option. Backing up the project in question is a must regardless of which route one takes to upgrade.

    Using Visual Studio 2010

    Life is much easier for developers who have already adopted Visual Studio 2010. Simply opening the MVC 1.0 solution file brings up the upgrade wizard, as shown in figures 1 through 4. As we proceed with the upgrade process, the wizard asks for confirmation on whether we want to upgrade our target framework version to .Net 4.0 or keep the existing .Net 3.5 (figure 5). VS2010 does a good job with multi-targeting: we can still develop .Net 3.5 applications while leveraging all the new bells and whistles that VS2010 brings to the table (multi-targeting lets us develop for frameworks as early as .Net 2.0 in VS2010).

    Figure 1 - Open Solution File Using VS2010
    Figure 2 - VS2010 Conversion Wizard
    Figure 3 - Ready To Convert To VS2010 Confirmation Screen
    Figure 4 - VS2010 Solution Conversion Progress
    Figure 5 - Confirm Target Framework Upgrade

    In an attempt to make my demonstration realistic, I opted to keep the project targeted at the .Net 3.5 framework. After the successful completion of the conversion process, a quick sanity check revealed that the NerdDinner project is still targeted at the .Net 3.5 framework, as shown in figure 6. Inspecting the Web.config revealed that the MVC DLL version our code compiles against has been successfully upgraded to 2.0 (figure 7), and hence we should now be able to leverage the newly introduced features in MVC2 and VS2010 with no effort or time invested in modifying existing code.

    Figure 6 - Confirm Target Framework Remained .Net 3.5
    Figure 7 - Confirm MVC DLL Version Has Been Upgraded

    In conclusion, Microsoft has empowered developers with the tools necessary to quickly and seamlessly upgrade their MVC solutions to the newly released MVC2. The multi-targeting feature in Visual Studio 2010 lets us adopt this latest development tool while still supporting development targets as early as .Net 2.0.

    References

    1. "Upgrading an ASP.NET MVC 1.0 Application to ASP.NET MVC 2" http://www.asp.net/learn/whitepapers/aspnet-mvc2-upgrade-notes

    Read the article

  • wordpress sites are slow on shared hosting but plain html/css sites are fast

    - by sam
    I've got a shared hosting account: unlimited sites, unlimited GB, unlimited bandwidth, etc. Of course, because it's shared (and a cheap one at that) there are too many sites on each server and it all runs slowly due to lack of RAM. What I've found is that my plain HTML/CSS/JS sites run an awful lot faster than my WordPress sites on this hosting, and I was trying to work out why. I'm not exactly sure how a browser sends a request for a page or the full process of request and delivery, but are my HTML sites running faster because the server just serves the files to the browser, whereas for the WordPress sites it has to run calculations against the database to build each page before it is delivered? Is that correct, or am I completely off course?
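
    The asker's intuition is right: a static file is streamed straight from disk, while each WordPress page is assembled by PHP from a series of database queries on every request. The usual mitigation on cheap shared hosting is page caching (WordPress plugins such as WP Super Cache do this). Below is a rough sketch of the idea in plain PHP; the cache path, lifetime, and render_page_from_database() function are illustrative assumptions standing in for the real templating/query work.

        <?php
        // File-based page cache sketch: serve a saved copy if it is fresh,
        // otherwise build the page once and save it.
        $cacheFile = __DIR__ . '/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
        $maxAge    = 600; // seconds

        if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
            readfile($cacheFile);   // fast path: no templating, no database
            exit;
        }

        ob_start();                              // slow path: generate once
        echo render_page_from_database();        // stand-in for WP's work
        $html = ob_get_clean();

        @mkdir(dirname($cacheFile), 0755, true);
        file_put_contents($cacheFile, $html);
        echo $html;

        // Hypothetical stand-in for the template + query pipeline.
        function render_page_from_database() {
            return '<html><body>rendered content</body></html>';
        }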

    Read the article

  • export emails from open-xchange webmail and import them to horde webmail

    - by sam
    I'm moving hosting and need to move the emails stored with the old domain/hosting. The hosting I want to transfer from uses Open-Xchange for its webmail, and the hosting I want to move to uses Horde for its webmail. Is there an automated way to do this from inside the Open-Xchange webmail, or is there a way I can do it from inside Outlook / Mac Mail? Not sure if this is the right place for this; if not, please advise.

    Read the article

  • Recaptcha php problem [closed]

    - by Sam Gabriel
    Hey guys, I'm using reCAPTCHA and I've got a problem: when a user clicks the signup button it redirects him to the sign-up verification page, and here is the code, found at the very top of the page, that checks the entered reCAPTCHA data:

        <?php
        require_once('recaptchalib.php');
        $privatekey = "***";
        $resp = recaptcha_check_answer($privatekey,
                                       $_SERVER["REMOTE_ADDR"],
                                       $_POST["recaptcha_challenge_field"],
                                       $_POST["recaptcha_response_field"]);
        if (!$resp->is_valid) {
            header('location: signup.php');
        }
        ?>

    But it seems that whatever I type in the reCAPTCHA box, be it right or wrong, I get redirected to the signup.php page. Here is the reCAPTCHA code in the signup.php page:

        <?php
        ini_set('display_errors', 'On');
        error_reporting(E_ALL | E_STRICT);
        require_once('recaptchalib.php');
        $publickey = "***";
        echo recaptcha_get_html($publickey);
        ?>
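
    A first debugging step (my assumption about the likely cause, not a confirmed diagnosis): verify that the two reCAPTCHA fields actually arrive in $_POST on the verification page. If the widget sits outside the <form> element, or the form submits with GET, recaptcha_check_answer() is handed empty strings and always fails, which would produce exactly the behaviour described. A minimal check along these lines:

        <?php
        require_once('recaptchalib.php');
        $privatekey = "***";

        // If either field is missing, the widget is probably outside the
        // <form> or the form does not use method="post".
        if (empty($_POST['recaptcha_challenge_field']) ||
            empty($_POST['recaptcha_response_field'])) {
            die('reCAPTCHA fields never reached this page - check the form markup.');
        }

        $resp = recaptcha_check_answer($privatekey,
                                       $_SERVER['REMOTE_ADDR'],
                                       $_POST['recaptcha_challenge_field'],
                                       $_POST['recaptcha_response_field']);

        if (!$resp->is_valid) {
            // $resp->error (e.g. "incorrect-captcha-sol") says why it failed.
            error_log('reCAPTCHA error: ' . $resp->error);
            header('Location: signup.php');
            exit; // always stop script execution after a redirect header
        }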

    Read the article

  • What's the best practice for linking to/from guest posts on other blogs?

    - by sam
    When writing a guest blog for a site, I include a link back to my site - an inbound link. If I were to write a post on my blog publicising my guest post, that would mean there were reciprocal links, thus cancelling each other out, correct? If I made the link (on my blog) back to the guest post nofollow, would that cancel the effect, meaning I still get link juice from the guest post? Further down the line if the site I posted on as a guest wanted to write a post for my site, what is the best way for me to prevent re-introducing the reciprocal link problem?

    Read the article

  • Singular or Plural Nouns as file names for better Search & SEO friendliness? [closed]

    - by Sam
    Possible Duplicate: Should I use singular or plural nouns in a domain name and why?

    Dear folks, here are two scenarios where the file names should best represent the search volume of the audiences searching for them.

    Scenario 1
    website.org/en/logo.php
    website.org/en/brochure.php
    website.org/en/poster.php
    website.org/en/design.php

    OR Scenario 2
    website.org/en/logos.php
    website.org/en/brochures.php
    website.org/en/posters.php
    website.org/en/designs.php

    Q1. What do you intuitively think would be best?
    Q2. What do the facts in general show: do people search for the singular or the plural?
    Q3. Do search engines have a common rule of thumb for this?
    Q4. Should I pick one scenario and use it consistently, or does it depend on the word?

    Thanks very much for your ideas/suggestions. I really don't know which one to go for.

    Read the article

  • What is a better abstraction layer for D3D9 and OpenGL vertex data management?

    - by Sam Hocevar
    My rendering code has always been OpenGL. I now need to support a platform that does not have OpenGL, so I have to add an abstraction layer that wraps OpenGL and Direct3D 9. I will support Direct3D 11 later. TL;DR: the differences between OpenGL and Direct3D cause redundancy for the programmer, and the data layout feels flaky.

    For now, my API works a bit like this. This is how a shader is created:

        Shader *shader = Shader::Create(
            " ... GLSL vertex shader ... ",
            " ... GLSL pixel shader ... ",
            " ... HLSL vertex shader ... ",
            " ... HLSL pixel shader ... ");
        ShaderAttrib a1 = shader->GetAttribLocation("Point", VertexUsage::Position, 0);
        ShaderAttrib a2 = shader->GetAttribLocation("TexCoord", VertexUsage::TexCoord, 0);
        ShaderAttrib a3 = shader->GetAttribLocation("Data", VertexUsage::TexCoord, 1);
        ShaderUniform u1 = shader->GetUniformLocation("WorldMatrix");
        ShaderUniform u2 = shader->GetUniformLocation("Zoom");

    There is already a problem here: once a Direct3D shader is compiled, there is no way to query an input attribute by its name; apparently only the semantics stay meaningful. This is why GetAttribLocation has these extra arguments, which get hidden in ShaderAttrib.

    Now this is how I create a vertex declaration and two vertex buffers:

        VertexDeclaration *decl = VertexDeclaration::Create(
            VertexStream<vec3,vec2>(VertexUsage::Position, 0, VertexUsage::TexCoord, 0),
            VertexStream<vec4>(VertexUsage::TexCoord, 1));
        VertexBuffer *vb1 = new VertexBuffer(NUM * (sizeof(vec3) + sizeof(vec2)));
        VertexBuffer *vb2 = new VertexBuffer(NUM * sizeof(vec4));

    Another problem: the information VertexUsage::Position, 0 is totally useless to the OpenGL/GLSL backend because it does not care about semantics.

    Once the vertex buffers have been filled with or pointed at data, this is the rendering code:

        shader->Bind();
        shader->SetUniform(u1, GetWorldMatrix());
        shader->SetUniform(u2, blah);
        decl->Bind();
        decl->SetStream(vb1, a1, a2);
        decl->SetStream(vb2, a3);
        decl->DrawPrimitives(VertexPrimitive::Triangle, NUM / 3);
        decl->Unbind();
        shader->Unbind();

    You see that decl is a bit more than just a D3D-like vertex declaration; it kinda takes care of rendering as well. Does this make sense at all? What would be a cleaner design? Or a good source of inspiration?

    Read the article

  • Work Item Traceability in TFS 2010

    - by Sam Patrick
    I have created a Windows Forms project (VS solution) under a TFS 2010 project. I may eventually add more solutions to the TFS project. My question: can we create a Use Case WIT for a specific solution within a TFS project? Furthermore, is it possible to create a "traceability matrix" that starts at the Use Case level and goes down to the code level (at least the namespace level) of that particular VS solution?

    Read the article

  • Advice on designing a robust program to handle a large library of meta-information & programs

    - by Sam Bryant
    So this might be overly vague, but here it is anyway. I'm not really looking for a specific answer, but rather general design principles or direction towards resources that deal with problems like this. It's one of my first large-scale applications, and I would like to do it right.

    Brief explanation

    My basic problem is that I have to write an application that handles a large library of meta-data, can easily modify the meta-data on the fly, is robust with respect to crashing, and is very efficient. (Sorta like the design parameters of iTunes, although sometimes iTunes performs more poorly than I would like.) If you don't want to read the details, you can skip the rest.

    Long explanation

    Specifically, I am writing a program that creates a library of image files and meta-data about these files. There is a list of tags that may or may not apply to each image. The program needs to be able to add new images, add new tags, assign tags to images, and detect duplicate images, all while operating. The program contains an image viewer which has tagging operations. The idea is that if a given image A is viewed while the library has tags T1, T2, and T3, then that image will have boolean flags for each of those tags (depending on whether the user tagged that image while it was open in the viewer). However, prior to being viewed in the viewer, image A would have no value for tags T1, T2, and T3. Instead it would have a "dirty" flag indicating that it is unknown whether or not A has these tags. The program can introduce new tags at any time (which would automatically set all images to "dirty" with respect to the new tag).

    This program must be fast. It must easily be able to pull up a list of images with or without a certain tag, as well as images which are "dirty" with respect to a tag. It has to be crash-safe, in that if it suddenly crashes, all of the tagging information done in that session is not lost (though perhaps it's okay to lose some of it). Finally, it has to work with a lot of images (10,000).

    I am a fairly experienced programmer, but I have never tried to write a program with such demanding needs, and I have never worked with databases. With respect to the meta-data storage, there seem to be a few design choices:

    Choice 1: Individual meta-data vs. centralized meta-data

    Individual meta-data: have a separate meta-data file for each image. This way, as soon as you change the meta-data for an image, it can be written to the hard disk without having to rewrite the information for all of the other images.

    Centralized meta-data: have a single file to hold the meta-data for every file. This would probably require meta-data writes in intervals as opposed to after every change. The benefit here is that you could keep a centralized list of all images with a given tag, etc., making the task of pulling up all images with a given tag very efficient.
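
    One way to get both the crash-safety and the fast tag queries described above - a sketch under my own assumptions, using PHP with SQLite as the embedded store - is to let the database handle durability and to model the tri-state tag explicitly: a missing row in a join table means "dirty" (unknown), while a stored 0/1 means no/yes. Each tagging action is then one small committed write, so a crash loses at most the tag being written, and introducing a new tag costs nothing because every image simply has no row for it yet.

        <?php
        // Tri-state tagging sketch: absent row = "dirty", value 0/1 = no/yes.
        // Schema and names are illustrative assumptions.
        $db = new PDO('sqlite:library.db');
        $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        $db->exec('CREATE TABLE IF NOT EXISTS images (id INTEGER PRIMARY KEY, path TEXT UNIQUE)');
        $db->exec('CREATE TABLE IF NOT EXISTS tags (id INTEGER PRIMARY KEY, name TEXT UNIQUE)');
        $db->exec('CREATE TABLE IF NOT EXISTS image_tags (
            image_id INTEGER, tag_id INTEGER,
            value INTEGER,               -- 0 = does not have tag, 1 = has tag
            PRIMARY KEY (image_id, tag_id))');

        // Recording a viewer decision is one durable statement.
        function setTag(PDO $db, $imageId, $tagId, $value) {
            $stmt = $db->prepare('INSERT OR REPLACE INTO image_tags VALUES (?, ?, ?)');
            $stmt->execute([$imageId, $tagId, $value]);
        }

        // All images still "dirty" for a given tag: those with no row yet.
        $dirty = $db->prepare('SELECT i.id, i.path FROM images i
            WHERE NOT EXISTS (SELECT 1 FROM image_tags it
                              WHERE it.image_id = i.id AND it.tag_id = ?)');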

    Read the article

  • Juju didn't configure rabbitmq for openstack?

    - by SaM
    I have installed Ubuntu OpenStack HA with Juju on all 24 servers, but my OpenStack is not working at all. On every dashboard page I get errors saying "could not retrieve usage information", "could not retrieve volume information", "could not retrieve ..." etc. I spent hours on this and have discovered that Juju has not done the configuration correctly. I found that on the cloud controller, in nova.conf, Juju has added a RabbitMQ vhost entry, but that virtual host was never created in RabbitMQ. How is that supposed to work? And on the juju-gui canvas RabbitMQ is all green and reported as working fine, which in reality it is not. I am really wondering whether Juju has done the configuration correctly on all 24 servers; I am getting the feeling that it would have been faster to deploy OpenStack manually instead of using Juju. Why was the virtual host entry not added in RabbitMQ? How should I solve this?

    Read the article

  • backlink anchor text / keyword strategy post penguin

    - by sam
    I've heard a lot recently about over-optimisation of off-site backlink anchor text. What I've heard from SEOmoz was that the sites most affected by the Penguin update had over 60% of their backlink anchor text the same, so Google saw this as unnatural and penalized them. Which kind of makes sense, as it's not normal to have such a high density on one word/phrase. If I were building links for a gastro pub in London (this is purely hypothetical), the sort of keywords I would go after are "gastro pub in london" and "london gastro pub". If I were to mix up the anchor text by having:

    gastro pub
    gastro pub in london
    london gastro pub

    would this be seen as OK? Or would these all be seen as broad-match keywords and counted as one phrase, making me fall foul of the Penguin update?

    Read the article

  • Network really slow with TL-WN951N wireless card

    - by Sam
    I literally just installed Ubuntu and it seems to be working great, except the network is deadly slow. I'm running a TL-WN951N wireless card which can download at about 600-700 KB/s in Windows, but in Ubuntu the max speed it seems to get is around 5 KB/s. I guess I should note that my WAP is only wireless-G, but like I said, I get much better speeds in Windows. I'm testing the speeds by downloading files from here: http://mirror.internode.on.net/pub/test/ Anyone have any idea? I saw some people recommend downloading drivers and compiling them myself, but I'm really, really new to all of this, so I would appreciate someone babying me through it so I don't brick my computer! Here are the results of lspci -v:

        04:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 02)
                Subsystem: Giga-byte Technology GA-EP45-DS5 Motherboard
                Flags: bus master, fast devsel, latency 0, IRQ 44
                I/O ports at c000 [size=256]
                Memory at e9110000 (64-bit, prefetchable) [size=4K]
                Memory at e9100000 (64-bit, prefetchable) [size=64K]
                [virtual] Expansion ROM at e9120000 [disabled] [size=64K]
                Capabilities: <access denied>
                Kernel driver in use: r8169
                Kernel modules: r8169

        05:02.0 Network controller: Atheros Communications Inc. AR5008 Wireless Network Adapter (rev 01)
                Subsystem: Atheros Communications Inc. Device 3071
                Flags: bus master, 66MHz, medium devsel, latency 168, IRQ 18
                Memory at e9200000 (32-bit, non-prefetchable) [size=64K]
                Capabilities: <access denied>
                Kernel driver in use: ath9k
                Kernel modules: ath9k

    Read the article

  • Setting up page goals in Analytics when using progressive enhancement to load content using jquery .load

    - by sam
    I'm using jQuery .load to pull content from other pages into my homepage. So that Google can still see what's going on, I've made the <a> tags point to the real pages but override them in the JavaScript, so instead of going to that page it just loads that page's content into the main page. Normally I would just make the page /contact.html a goal. Can I still get it to work as a goal if the content is being loaded in? Can I do something like: when the user clicks <a href="contact.html" id="load-contact">contact</a>, it logs the clicking of the <a> tag as a goal, rather than the actual page being visited?

    Read the article

  • The Basics of Project Management / Software Development

    - by Sam
    It suddenly struck me today that I have never developed any large application or worked with a team of programmers, and so am missing out on a lot - both in terms of technical knowledge and the social/fun part of it. I would like to rectify that - an idea is to start an open source group by training college students (for no charge) and developing some open source application with them. Please give me some basic advice on the whole process of how to (1) plan and (2) manage projects in a team. What new skill sets would you recommend? (I have read Joel on Software and 37signals, and got many insightful tips from them, but I'd like a little more technical knowledge...) Background (freelancer, past 4+ years): computer engineer; graphic/web designer; online marketing; moved on to programming in PHP, Perl, Python; did Oracle DBA OCP training to understand DBs; current self-assigned title: web application developer.

    Read the article

  • Meaningful concise method naming guidelines

    - by Sam
    Recently I started releasing an open source project. While I was the only user of the library I did not care about the names, but now I want to assign clear names to each method to make the library easier to learn, and the names also need to be concise so they are easy to write. I was thinking about some guidelines for the naming. I am aware of lots of guidelines that only care about letter casing or similar simple notes; here, I am looking for guidelines for meaningful, concise naming. For example, this could be part of the guidelines I am looking for (see the sketch below for an illustration):

    Use Add when an existing item is going to be added to a target; use Create when a new item is being created and then added to a target.
    Use Remove when an existing item is going to be removed from a target; use Delete when an item is going to be removed permanently.
    Pair AddXXX methods with RemoveXXX, and pair CreateXXX methods with DeleteXXX, but do not mix them.

    The above guidance may be intuitive for native English speakers, but for me, with English as my second language, I need to be told about things like this.
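
    As a concrete illustration of the Add/Remove vs. Create/Delete pairing described above (a hypothetical interface, invented purely to show the convention):

        <?php
        // Add/Remove move an existing object in and out of a collection;
        // Create/Delete control the object's whole lifetime.
        interface PlaylistStore
        {
            // Pairs with removeTrack(): the track already exists elsewhere.
            public function addTrack($playlistId, $trackId);
            public function removeTrack($playlistId, $trackId);

            // Pairs with deletePlaylist(): creation and permanent removal.
            public function createPlaylist($name);      // returns a new id
            public function deletePlaylist($playlistId);
        }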

    Read the article

  • Unindexing my tumblr blog's content and moving it to another tumblr blog

    - by sam
    I've been writing a tumblr blog for the past year or so and have written about 300 articles, but now I need to move the blog to another site (before, it was running under blog.mysite.com and I now want it to run under blog.my*new*site.com). I want to keep the archived articles and have them on the new site, so what I was hoping to do was export the blog from tumblr, go into Webmaster Tools and remove all the blog's indexed URLs from Google, then make a new tumblr blog and import the posts. Would Google see this as new content, since I've deleted their indexed copy? Could I just re-map the tumblr blog to the new subdomain? But in doing that I would lose all the PR, and it would still look like duplicate content. What's the best way to approach this?

    Read the article

  • adwords traffic shows as 'not set' in google analytics

    - by sam
    In Google Analytics, if I go to Traffic Sources > Search > Paid, all I get is "not set" instead of the usual list of keywords that people have used to find the ad, which makes it really difficult to understand what's going on in the campaign. I've gone into AdWords and turned on auto-tagging, but I still have the same problem. Any idea how I can fix this, so that under the Paid tab I get the phrases people have used to find my ad in PPC?

    Read the article
