Search Results

Search found 4953 results on 199 pages for 'special treatment'.

Page 105/199 | < Previous Page | 101 102 103 104 105 106 107 108 109 110 111 112  | Next Page >

  • Create a device to receive an SMS and parse the text (SMS Gateway)

    - by Chris Okyen
    I want to use a server as a device that runs a script to parse an SMS text in the following way:
      I. The person types in a specific and special cell phone number (similar to Facebook’s 32556 number used to post on your wall).
      II. The user types a text message.
      III. The user sends the text message.
      IV. The message is sent to some kind of device (the server) or SMS gateway, which receives it.
      V. The thing described above that the message is sent to then parses the text message.
    I understand that these three questions mix programming and server stuff and could reside here or at DBA.SE. How would I make such a cell phone number (described in step I) that would be sent to the device? How do I create the device that would then receive it? Finally, how do I parse the text message? I don't want to pay for cloud space, server scripting stuff or server space; I want to just use a free webserver to do this totally free - meaning I will have to do more on my own... My question can be seen in more depth in this visual flowchart.
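
    Most SMS gateways deliver an inbound text to your server as an HTTP callback, so steps IV-V usually come down to a small web endpoint that reads the gateway's POST and splits the message body. A minimal sketch (the endpoint path and the "from"/"body" parameter names are assumptions - every gateway documents its own callback format):

      import com.sun.net.httpserver.HttpServer;
      import java.io.IOException;
      import java.net.InetSocketAddress;
      import java.net.URLDecoder;
      import java.nio.charset.StandardCharsets;
      import java.util.HashMap;
      import java.util.Map;

      // Step IV: an HTTP endpoint an SMS gateway could call when a text arrives.
      public class SmsWebhook {

          public static void main(String[] args) throws IOException {
              HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
              server.createContext("/sms", exchange -> {
                  byte[] raw = exchange.getRequestBody().readAllBytes();
                  Map<String, String> params = parseForm(new String(raw, StandardCharsets.UTF_8));
                  String sender = params.getOrDefault("from", "unknown");
                  String text = params.getOrDefault("body", "");

                  // Step V: parse the message text, e.g. "POST hello world" -> command + payload.
                  String[] parts = text.trim().split("\\s+", 2);
                  String command = parts[0].toUpperCase();
                  String payload = parts.length > 1 ? parts[1] : "";
                  System.out.printf("from=%s command=%s payload=%s%n", sender, command, payload);

                  exchange.sendResponseHeaders(200, -1);
                  exchange.close();
              });
              server.start();
          }

          // Parse an application/x-www-form-urlencoded body into key/value pairs.
          private static Map<String, String> parseForm(String body) {
              Map<String, String> map = new HashMap<>();
              for (String pair : body.split("&")) {
                  String[] kv = pair.split("=", 2);
                  if (kv.length == 2) {
                      map.put(URLDecoder.decode(kv[0], StandardCharsets.UTF_8),
                              URLDecoder.decode(kv[1], StandardCharsets.UTF_8));
                  }
              }
              return map;
          }
      }

    The special short number itself (step I) is something you lease from a carrier or gateway provider rather than something you can create in code.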

    Read the article

  • Reading OpenDocument spreadsheets using C#

    - by DigiMortal
    Excel and its file formats are not the only widely used spreadsheet options. There are also users on Linux and Macs, and they often use OpenOffice and other open-source office packages that use ODF instead of OpenXML. In this post I will show you how to read an OpenDocument spreadsheet in C#.

    Importer as example. My previous post about importers showed you how to build flexible importer support into your web application. This post gives a practical example of one of my importers. Of course, sensitive code is omitted. We start with the ODS importer class and add new methods as we go.

      public class OdsImporter : ImporterBase
      {
          public OdsImporter()
          {
          }

          public override string[] SupportedFileExtensions
          {
              get { return new[] { "ods" }; }
          }

          public override ImportResult Import(Stream fileStream, long companyId, short year)
          {
              string contentXml = GetContentXml(fileStream);

              var result = new ImportResult();
              var doc = XDocument.Parse(contentXml);

              // All data rows of the sheet; the first row is skipped because it holds the headers.
              var rows = doc.Descendants("{urn:oasis:names:tc:opendocument:xmlns:table:1.0}table-row").Skip(1);

              foreach (var row in rows)
              {
                  ImportRow(row, companyId, year, result);
              }

              return result;
          }
      }

    The class given here just extends the base class for importers (the previous post used an interface, but as I said there, you move to an abstract base class when writing code for real projects). The Import method reads data from the *.ods file, parses it (it is XML), finds all data rows and imports the data. As you can see, the first row is skipped, because the first row on my sheet is always the header row.

    Reading the ODS file. Our import method starts by getting the XML out of the *.ods file. ODS files, like OpenXML files, are zipped containers that hold several files. We need content.xml, as all the data is kept there. To get the contents of the file we use the SharpZipLib library to read the uploaded file as a *.zip file.

      private static string GetContentXml(Stream fileStream)
      {
          var contentXml = "";

          using (var zipInputStream = new ZipInputStream(fileStream))
          {
              ZipEntry contentEntry = null;
              while ((contentEntry = zipInputStream.GetNextEntry()) != null)
              {
                  if (!contentEntry.IsFile)
                      continue;
                  if (contentEntry.Name.ToLower() == "content.xml")
                      break;
              }

              if (contentEntry == null || contentEntry.Name.ToLower() != "content.xml")
              {
                  throw new Exception("Cannot find content.xml");
              }

              // Read the zip entry into memory in 2000-byte chunks.
              var bytesResult = new byte[] { };
              var bytes = new byte[2000];
              var i = 0;

              while ((i = zipInputStream.Read(bytes, 0, bytes.Length)) != 0)
              {
                  var arrayLength = bytesResult.Length;
                  Array.Resize<byte>(ref bytesResult, arrayLength + i);
                  Array.Copy(bytes, 0, bytesResult, arrayLength, i);
              }
              contentXml = Encoding.UTF8.GetString(bytesResult);
          }
          return contentXml;
      }

    When we find the content.xml entry we stop browsing the archive, read the file into memory and return it as a UTF-8 string.

    Importing rows. Our last task is to import the rows. We use a special method for this because there are a couple of tricks to handle. To keep files smaller, the cell count per row is not always the same: if there is more than one empty cell in a row, ODS keeps only one cell element for the whole run of empty cells. This cell has an attribute called number-columns-repeated, and its value is set to the number of sequential empty cells. This is why we use two indexes for the cells collection.

      private void ImportRow(XElement row, long companyId, short year, ImportResult result)
      {
          var cells = (from c in row.Descendants()
                       where c.Name == "{urn:oasis:names:tc:opendocument:xmlns:table:1.0}table-cell"
                       select c).ToList();

          var dto = new DataDto();

          var count = cells.Count;
          var j = -1;

          for (var i = 0; i < count; i++)
          {
              j++;
              var cell = cells[i];

              // A single cell element can stand for several sequential empty cells.
              var attr = cell.Attribute("{urn:oasis:names:tc:opendocument:xmlns:table:1.0}number-columns-repeated");
              if (attr != null)
              {
                  var numToSkip = 0;
                  if (int.TryParse(attr.Value, out numToSkip))
                  {
                      j += numToSkip - 1;
                  }
              }

              if (i > 30) break;

              if (j == 0)
              {
                  dto.SomeProperty = cells[i].Value;
              }
              if (j == 1)
              {
                  dto.SomeOtherProperty = cells[i].Value;
              }
              // some more data reading
          }

          // save data
      }

    You can define your own class for import results and collect there all the problems found during the import; your application then gets the results and shows them to the user.

    Conclusion. Reading ODS files may seem a complex task, but it is actually very easy if we only need the data from those documents. We can use a zip library to get the content file and then parse it as XML. Walking through the XML is not hard, but there are a few tricks to know. The code here is safe to use in web applications, as it does not use any APIs with special server or infrastructure requirements.

    Read the article

  • Tell me why I should bother using Linux if it's all about problems getting the OS to install or work properly? [closed]

    - by Vilhjalmur Magnussin
    Why should I spend days trying to get Ubuntu to either install and/or work properly? I'm using an Acer Timeline X laptop, and if I install 10.04 the wireless doesn't work, while if I try installing 11.04 it either won't install, or if it installs it's full of bugs causing my computer to freeze all the time. So please, I'm all ears. Someone give me one or two good reasons to continue wasting time (in the hope it eventually works) before I decide to focus my time on other things, like productivity (using Windows like I've been doing successfully for the last 10 years). This is the second time I've given Ubuntu a try; the first time was in 2010 using Ubuntu Studio and Ubuntu Desktop, and it ended with me shifting back to Windows since I had spent more time getting everything to work than actually working while trying Ubuntu. I really don't understand why it needs to be like this. Why go on trying when all I see is forums full of discussions about problems which people are having difficulties fixing? Or maybe there is just one special type of computer which works well with Linux? I would very much like to know which computer that is. So please, if it's not too much trouble, I really want to hear from someone who has something good to say about going through all this trouble just to get a working environment up and running, since I already have a working environment up and running called Windows. Thanks, Villi.

    Read the article

  • Prevent Click Fraud in Advertisement system with PHP and Javascript

    - by CodeDevelopr
    I would like to build an advertising project with PHP, MySQL, and Javascript. I am talking about something like Google AdSense, BuySellAds.com, or any other advertising platform. My question is mainly: what do I need to look out for to prevent people cheating the system, and what other issues might I encounter? My design concept: an advertisement is a record in the database. When a page is loaded, Javascript calls my server, which in turn uses a PHP script to query the database and get a random advertisement. (It may do more, like get an ad based on demographics or other criteria as well.) The PHP script then returns the advertisement to the server/website that is calling it, and it is shown on the page as an image with a special tracking link. I will need to:
      - Count all impressions (when the advertisement is shown on the page)
      - Count all clicks on the advertisement link
      - Count all unique clicks on the advertisement link
    My question is purely about the querying and displaying of the advertisement and has nothing to do with the administration side. If there is ever money involved in my advertisement buying/selling of ad space, then the stats need to be accurate and people must not be able to cheat the system easily. Is tracking the IP address really the only way to try to prevent click fraud? I am hoping someone with experience can confirm I am on the right track, and give me any advice, tips, or anything else I should know about doing something like this.
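
    For the counting side, here is a rough sketch of one workable scheme (written in Java purely for illustration, since the question targets PHP/MySQL): impressions and clicks are plain counters, while "unique" clicks are de-duplicated per ad per day using a fingerprint built from the IP address and User-Agent. This is only one assumption about how to define "unique", not a complete anti-fraud solution.

      import java.nio.charset.StandardCharsets;
      import java.security.MessageDigest;
      import java.security.NoSuchAlgorithmException;
      import java.time.LocalDate;
      import java.util.HashMap;
      import java.util.HashSet;
      import java.util.Map;
      import java.util.Set;

      // In-memory stand-in for the database tables described in the question.
      class AdStats {

          private final Map<Long, Long> impressions = new HashMap<>();
          private final Map<Long, Long> clicks = new HashMap<>();
          private final Map<Long, Set<String>> uniqueClickFingerprints = new HashMap<>();

          void recordImpression(long adId) {
              impressions.merge(adId, 1L, Long::sum);
          }

          void recordClick(long adId, String ip, String userAgent) {
              clicks.merge(adId, 1L, Long::sum);
              // One fingerprint per visitor per day; repeat clicks do not grow the unique count.
              String fingerprint = sha256(ip + "|" + userAgent + "|" + LocalDate.now());
              uniqueClickFingerprints
                      .computeIfAbsent(adId, k -> new HashSet<>())
                      .add(fingerprint);
          }

          long uniqueClicks(long adId) {
              return uniqueClickFingerprints.getOrDefault(adId, Set.of()).size();
          }

          private static String sha256(String input) {
              try {
                  MessageDigest md = MessageDigest.getInstance("SHA-256");
                  StringBuilder hex = new StringBuilder();
                  for (byte b : md.digest(input.getBytes(StandardCharsets.UTF_8))) {
                      hex.append(String.format("%02x", b));
                  }
                  return hex.toString();
              } catch (NoSuchAlgorithmException e) {
                  throw new IllegalStateException(e);
              }
          }
      }

    A click is still counted as a raw click every time, but it only grows the unique count once per fingerprint per day; stricter schemes add cookies, logged-in user IDs, or rate limits on top of this.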

    Read the article

  • Red Samurai Performance Audit Tool – OOW 2013 release (v 1.1)

    - by JuergenKress
    We have been running our Red Samurai Performance Audit tool and monitoring ADF performance in various projects for about a year and a half. It helps us a lot to understand ADF performance bottlenecks and to tune slow ADF BC View Objects or optimise large ADF BC fetches from the DB. There is a special update implemented for OOW'13 - advanced ADF BC statistics are collected directly from your application's ADF BC runtime and later displayed as graphical information in the dashboard. I will be attending OOW'13 in San Francisco; feel free to stop me and ask about this tool - I will be happy to give it away and explain how to use it in your project. Original audit screen with ADF BC performance issues (part of our Audit console application). Audit console v1.1 is improved with one more tab - Statistics. This tab displays all SQL select statements produced by ADF BC over time, logged users, AM access load distribution and the number of AM activations along with user sessions. Available graphs:
      - Daily Queries - total number of SQL selects per day
      - Hourly Queries - last 48 hours
      - Logged Users - total number of user sessions per day
      - SQL Selects per Application Module - workload per Application Module
      - Number of Activations and User Sessions - last 48 hours - displays stress load
    Read the complete article here. WebLogic Partner Community: for regular information, become a member of the WebLogic Partner Community. Please visit: http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Technorati Tags: Red Samurai, ADF performance, WebLogic, WebLogic Community, Oracle, OPN, Jürgen Kress

    Read the article

  • Fresh install on SSD with Ubuntu and Windows Vista, using whole disk encryption for Ubuntu

    - by nategator
    I would like to do a fresh install on an OCZ Vertex Plus R2 60GB SSD I purchased on the cheap. Since AES encryption looks like it may not work optimally on this drive, I would like to set up a dual boot of Windows Vista (the only Windows copy I have for clean-install purposes) and Ubuntu 12.04 with the best encryption scheme possible. My plan is to keep Windows around just in case I need a program that won't work with Wine, and to use Ubuntu as my daily OS with all of my information secured in case the laptop is ever stolen or sold. Although this setup will not provide a lot of space, I think I can squeeze in both OSes and have enough room for second-computer office tasks. So, my questions are:
      - Which OS should I install first, Ubuntu or Vista?
      - Any special considerations when partitioning the drive?
      - How should I install Ubuntu to ensure full disk encryption for the Linux partition(s) and/or my daily computing?
      - Is there a significant performance upgrade in doing a solo install of Ubuntu instead of a dual-boot setup? Will TRIM, for example, work correctly?
      - Are there any significant security concerns with going the dual-boot route, other than the fact that any activity on Windows may be fully recoverable if the drive is stolen or sold?
    Thanks in advance!

    Read the article

  • December 2012 OTN Member Offers

    - by Cassandra Clark - OTN
    Our partners have answered the special offer call just in time for you to either shop for the tech professional in your life or share the list below with someone who keeps asking what you want for the holidays. Go right to the Oracle Technology Network Member Discount Page or read on for more details.

    Oracle Store - Has extended its 10% savings through December 31st, 2012.

    Oracle Press - Oracle Technology Network members get 40% off the latest Oracle Press book by Oracle ACE Directors Ben Prusinski and Gustavo Gonzalez, Oracle E-Business Suite Financials Handbook, Third Edition, in print and ebook format.

    CRC Press - Has added 3 NEW titles! Get 20% off the titles below at checkout.
      - Secure Java: For Web Application Development
      - Open Source Data Warehousing and Business Intelligence
      - Developing Essbase Applications: Advanced Techniques for Finance and IT Professionals
      - Oracle Embedded Programming and Application Development

    Packt Publishing - Get 25% off the print books and 35% off the eBooks listed below. You will need to be logged in for the discounts to apply at checkout, and codes expire December 31st, 2012.
      - Getting Started with Oracle Data Integrator 11g: A Hands-On Tutorial
      - Oracle Business Intelligence Enterprise Edition 11g: A Hands-On Tutorial
      - Oracle Certified Associate, Java SE 7 Programmer Study Guide

    Safari Online - Give the Gift of Knowledge This Holiday Season. Give your friends and colleagues the gift of Safari Books Online! With an ever-expanding library of books and videos from more than 100 publishers (including Oracle Press), a subscription to Safari Books Online is the gift that always fits, helping your friends learn new skills and stay current. Starting at $42.99, gift subscriptions are available for 1, 3, 6 and 12 months.

    Get all of this and more at the Oracle Technology Network Member Discount Page!

    Read the article

  • Oracle ADF and Simplified UI Apps: I18n Feng Shui on Display

    - by ultan o'broin
    I demoed the Hebrew language version of Oracle Sales Cloud Release 8 live in Israel recently. The crowd was yet again wowed by the simplified UI (SUI). I’ve now spent some time playing around with most of the 23 language versions, or the NLS (Natural Language Support) versions as we’d call them, available in Release 8. Hebrew Oracle Sales Cloud Release 8 The simplified UI is built using 100% Oracle ADF. This framework is a great solution for developers to productively build tablet-first, mobility-driven apps for users who work and live using natural languages other than English. Oracle ADF’s internationalization (i18n) relies on built-in Java and Unicode,  packing in i18n goodness such as Bi-Di (or bi-directional) flipping of pages, locale-enabled resource bundles, date and time support, and so on. Comparing German (left) and Hebrew Bi-Di (right) page components in the simplified UI. Note the change in the direction of the arrows and positions of the text. So, developers who need to build global apps don’t have to do anything special when using Oracle ADF components, all thanks to the baked-in UX Feng Shui, as Grant Ronald of the ADF team would say to the UK Oracle User Group. Find out more  about  ADF i18n from Frédéric Desbiens (@blueberrycoder)  on the ADF Architecture TV channel.

    Read the article

  • Two Copies of "Silverlight 5 In Action" to Give Away and a FREE chapter!

    - by Dave Campbell
    I know most of you have seen my post from Tuesday where I talked about giving away 2 copies of Pete's book on Monday morning July 18th. Well... I'm repeating it, because it's a smoking deal... for the cost of an email you too can take a shot at getting Pete's latest release, "Silverlight 5 In Action", for free.

    2 Important Pieces of Information
    1) The deadline: midnight Sunday night, July 17, 2012, Arizona time... if you know me, you know I've lived here too long and am timezone stupid... so don't make me calculate it out :)
    2) The how: I have a special email address for submittals: mailto:[email protected]?Subject=Giveaway.
    3) oh yeah... I lied about only 2 pieces of info... number 3... there may be other surprises on Monday morning... 'nuff said
    4) and just to pump up the volume on the book... how about a free chapter you can read right here on Working with RSS and Atom!
    5) send me an email and Stay in the 'Light!

    Read the article

  • ADF Faces Skin Editor - How to Work with It

    - by Shay Shmeltzer
    The ODTUG Kscope11 conference was a great success, with lots of sessions about FMW running in a special track. I did several sessions and labs at the conference, and I thought it might be a good idea to at least give you a taste of what you might have missed. So here is most of what I demoed in my ADF Faces Skinning session (not all of it though - that session was 60 minutes long, and while everyone did end up going outside in the middle for about 5 minutes because of a fire drill, other things were covered in the session as well). In the demo here you'll see how to generate new images and a default color scheme, how to identify a component class with Firebug, how to skin a component, how to identify the global selector of a property, how to change fonts and how to change strings. By the way, for more on ADF skinning you should also listen to the ADF Insider seminar that Frank Nimphius recorded on skinning; it will give you a better understanding of the overall skinning process. P.S. In the demo I add an entry to the web.xml file which prevents ADF Faces from compressing the HTML that is generated. The entry is for org.apache.myfaces.trinidad.DISABLE_CONTENT_COMPRESSION and I set it to true. This is very useful while you work on creating the skin, but don't forget to un-set it before you go to production.
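
    For reference, a context parameter like the one described is normally declared in web.xml along these lines (a minimal sketch; only the parameter name and value come from the post above):

      <context-param>
        <param-name>org.apache.myfaces.trinidad.DISABLE_CONTENT_COMPRESSION</param-name>
        <param-value>true</param-value>
      </context-param>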

    Read the article

  • Sharing My Thoughts on Space Flight

    - by Grant Fritchey
    This went out in the DBA newsletter from Red Gate, but I enjoyed writing it so much, I thought I'd share it with a wider audience: I grew up watching the US space program. I watched men walk on the moon for the first time in 1969, when I was only six years old. From that moment on, I dreamed of going into space. I studied aeronautics and tried to get into the Air Force Academy, all in preparation for my long career as an astronaut. Clearly, that didn't quite work out for me. But it sure could for you. At Red Gate, we're running a new contest: DBA in Space. The prize is a sub-orbital flight. When I first got word of this contest, my immediate response was, "And you need me to go right away and do a test flight? Excellent!" No, no test flight needed, plus I was pretty low on the list of volunteers. "That's OK, I'll just enter." Then I was told that, as a Red Gate employee, I couldn't win. My next response was, "I quit." Eventually, I was talked down off the ledge and agreed to help make this special for some other DBA. Many (most?) of us are science fiction fans, either the soft science of Star Trek and Star Wars, or the hard science of Niven and Pournelle, or Allen Steele. We watched the Shuttles go up and land. We've been dreaming of our own trips into orbit and our vacation home on the Moon for a long, long time. All that might not arrive on schedule, but you've got a shot at breaking clear of the atmosphere. The first stage is a video quiz, starring Brad McGehee, and it's live at www.DBAinSpace.com now. Go for it. Good luck and Godspeed!

    Read the article

  • External display in Ubuntu 13.04

    - by thuc2009
    I have had a look at a series of similar topics, but I could not find the right answer. I recently bought a Dell Latitude E6530 laptop and installed Ubuntu 13.04 (64-bit) alongside Windows 7 (I could not get rid of it because of some special programs for work). Ubuntu worked well; however, it did not detect the external displays (a Dell 23" and a Dell 19") no matter whether I plugged them in before or after booting. Edited: I connected my laptop to those external displays through a laptop dock. I went to Settings > Displays; the displays were there, but when I enabled them, Ubuntu logged off automatically. No external display was detected. In addition, the laptop screen was sometimes detected as a Dell 23". I got advice from a similar topic that the nvidia driver should be updated to version 313 or 319. I upgraded the driver and the Unity launcher disappeared, the main display resolution turned to 640x480, and none of the external displays was detected. I tried to use Additional Drivers, but when I turned it on, it told me that there was no driver to be used. Could someone tell me what I should do? Edited: options vmwgfx enable_fbdev=1

    Read the article

  • Need advice on framework design: how to make extending easy

    - by beginner_
    I'm creating a framework/library for a rather specific use case (data type). It uses various Spring components, including Spring Data. The library has a set of entity classes properly set up, with corresponding service and DAO layers. The main work, or main benefit, of the framework lies in the DAO and service layers. Developers using the framework should be able to extend my entity classes to add the additional fields they require. Therefore I made the DAO and service layers generic so they can be used by such extended entity classes. I now face an issue in the IO part of the framework: it must be able to import the corresponding "special data type" into the database. In this part I need to create a new entity instance, and hence I need the actual class being used. My current solution is to configure a Spring bean of the actual class used. The problem with this is that an application using the framework can then only use one implementation of the entity (the original one from me, or exactly one subclass, but not two different classes of the same hierarchy). I'm looking for suggestions / designs for solving this issue. Any ideas?
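
    One way to lift the one-bean limitation (a minimal sketch only - all class names here are illustrative, not taken from the poster's framework) is to parameterize the importing component with a factory for the concrete entity type, so an application can register one importer per subclass:

      import java.util.function.Supplier;

      // Hypothetical framework base entity.
      class BaseEntity {
          String id;
      }

      // The IO component takes a factory instead of a fixed Spring bean,
      // so the concrete subclass is decided by whoever wires it up.
      class GenericImporter<T extends BaseEntity> {

          private final Supplier<T> entityFactory;

          GenericImporter(Supplier<T> entityFactory) {
              this.entityFactory = entityFactory;
          }

          T importRecord(String rawValue) {
              T entity = entityFactory.get(); // creates the application's concrete subclass
              entity.id = rawValue;           // populate common (framework-level) fields here
              return entity;
          }
      }

      // An application can now extend the entity twice and wire two importers
      // (e.g. as two Spring beans), which the single-bean approach does not allow.
      class CustomerEntity extends BaseEntity { String name; }
      class SupplierEntity extends BaseEntity { String taxCode; }

      class ImporterDemo {
          public static void main(String[] args) {
              GenericImporter<CustomerEntity> customers = new GenericImporter<>(CustomerEntity::new);
              GenericImporter<SupplierEntity> suppliers = new GenericImporter<>(SupplierEntity::new);
              System.out.println(customers.importRecord("c-1").id);
              System.out.println(suppliers.importRecord("s-1").id);
          }
      }

    In a Spring setup the Supplier could come from a @Bean method or from reflection on a configured Class, but that wiring is left out of the sketch.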

    Read the article

  • How would I batch rename a lot of files using command-line?

    - by Whisperity
    I have a problem which I am unable to solve: I need to rename a huge dump of files using patterns. I tried using this, but I always get an error. I have a folder with a lot of files inside; running ls -1 | wc -l returns that I have around 160000 files. The problem is that I wish to move these files to a Windows system, but most of them have characters like : and ? in them, which makes the files inaccessible on said Windows-based systems. (As a "do not solve but deal with" method, I tried booting up a LiveCD on the Windows system and moving the files using the live OS. Under that Ubuntu, the files were readable and writable on the mounted NTFS partition, but when I booted back into Windows, it showed that the file was there but Windows was unable to access it in any fashion: rename, delete or open.) I tried running rename 's/\:/_' * inside the folder, but I got an Argument list too long error. Some searching revealed that this happens because I have so many files, and that is how I arrived here. The problem is that I don't know how to alter the command to suit my needs, as I always end up with various errors. For example, trying find -name '*:*' | xargs rename : _ gives xargs: unmatched single quote; by default quotes are special to xargs unless you use the -0 option [\n] syntax error at (eval 1) line 1, near ":" [\n] xargs: rename: exited with status 255; aborting Adding the -0 after xargs turns the error message into xargs: argument line too long These files are archive files generated by various PHP scripts. The best solution would be a chance to rename them before they are moved to Windows, but if there is no way to do that, there might be a way to rename the files while they are moved to Windows. I use Samba and proftpd to move the files. Unfortunately, graphical software is out of the question, as the server containing the files is what it is, a server, with only a command-line interface.

    Read the article

  • Cocos2d v2.0 and OpenGL 2.0/1.0: where to start

    - by mm24
    I started developing my very first game 3 months ago using cocos2d 2.0 for iPhone. I am now at the stage where I'd like to add some cool effects to the bullets and some special weapons (see my waveforms question here). I got a good answer in the cocos2d-iphone forum (see this one). Unfortunately I am a bit paralyzed now: I don't know whether I would be overdoing it by learning OpenGL 2.0, or whether I should just stick to the old 1.0. There is a good intro to various tutorials written in Steffen Itterheim's blog (see this post). I would like to add to my game:
      - a blur effect for the bullets (here is a tutorial for OpenGL 1.0)
      - a waveform (see above)
      - some realistic water ripples (here is a nice sample code)
    So now, given that I don't want to overdo things but at the same time I want to achieve those effects, where should I start? Should I discard the OpenGL 1.0 tutorials, or should I use only OpenGL 1.0 code? How can I avoid confusion? I mean, it seems that the compiler recognizes both, but that there are some conflicting calls in some circumstances. I am fairly sure this has some explanation - is there some reference to this somewhere?

    Read the article

  • Enterprise Cloud Infrastructure for Dummies eBook

    - by ferhat
    Are you considering "going to the cloud" as a way to cut IT costs and maximize your virtualization investments? Then Enterprise Cloud Infrastructure for Dummies is a no-nonsense guide to help you navigate this hot topic. This user-friendly guide explains how to cut through the noise and take advantage of integrated virtualization and management tools to implement a cloud infrastructure that not only lowers operational costs but can also easily adapt and scale to run a broad range of application services safely and securely. This e-book will serve as a valuable cloud computing guide covering important topics such as:
      - The current overall cloud landscape and how to best leverage private cloud infrastructure
      - How to build an effective Enterprise Cloud Infrastructure using the Oracle Optimized Solution methodology
      - Quantifiable cost savings gained using Oracle's integrated hardware and software and Optimized Solutions
    Download your exclusive copy of Enterprise Cloud Infrastructure, Oracle Special Edition today.

    Read the article

  • What's wrong with cplusplus.com?

    - by Kerrek SB
    This is perhaps not a perfectly suitable forum for this question, but let me give it a shot, at the risk of being moved away. There are several references for the C++ standard library, including the invaluable ISO standard, MSDN, IBM, cppreference, and cplusplus. Personally, when writing C++ I need a reference that has quick random access, short load times and usage examples, and I've been finding cplusplus.com pretty useful. However, I've been hearing negative opinions about that website frequently here on SO, so I would like to get specific: What are the errors, misconceptions or bad pieces of advice given by cplusplus.com? What are the risks of using it to make coding decisions? Let me add this point: I want to be able to answer questions here on SO with accurate quotes of the standard, and thus I would like to post immediately-usable links, and cplusplus.com would have been my choice site were it not for this issue.

    Update: There have been many great responses, and I have seriously changed my view on cplusplus.com. I'd like to list a few choice results here; feel free to suggest more (and keep posting answers). As of June 29, 2011:
      - Incorrect description of some algorithms (e.g. remove).
      - Information about the behaviour of functions is sometimes incorrect (atoi), fails to mention special cases (strncpy), or omits vital information (iterator invalidation).
      - Examples contain deprecated code (#include style).
      - Inexact terminology is doing a disservice to learners and the general community ("STL", "compiler" vs "toolchain").
      - Incorrect and misleading description of the typeid keyword.

    Read the article

  • What You Said: How You Share Your Photos

    - by Jason Fitzpatrick
    Earlier this week we asked you to share your favorite tips, tricks, and tools for sharing photos with friends and family. Now we’re back to highlight the ways HTG readers share their pics. Image available as wallpaper here. By far the most popular method of photo sharing was to upload the pictures to cloud-based storage. Many readers took advantage of sizable SkyDrive accounts. Dragonbite writes: I used to use PicasaWeb (uploaded from Shotwell) until I got the SkyDrive w/25 GB available. My imported pictures are automatically synchronized with SkyDrive and I then send out a link to whomever I want. I have another (desktop) computer where all of the pictures are stored from mine and my wife’s camera’s imports so if I need to free up some space on SkyDrive or my Windows 7 laptop, I double-check they are in the desktop computer before deleting them from my laptop (and thus from SkyDrive as well). I wish SkyDrive enabled some features like rotate, or searching by Tagged person.

    Read the article

  • A Fresh Start

    - by Laila
    As you may already be aware, I'm no longer responsible for the .NET Reflector newsletter. That publication is now in the very capable hands of the Reflector team. But fear not; starting in early April, I'll be launching a brand new .NET Newsletter, and I invite you to enjoy the very first edition by subscribing to our new mailing list, or by updating your Simple-Talk subscriptions and joining the .NET Newsletter mailing list. With a fresh and snappy design (it might even be described as idiosyncratic, but I can say no more at this stage), we'll be making a brand new start. Each month, a member of my team (that's the Red Gate .NET team) will host the .NET Newsletter, bringing you the choicest cuts of breaking news and the very best .NET content from Simple-Talk, alongside details of hot upcoming events. To top it off, not only will you be among the first to get access to free resources (including free wall-charts, training videos and eBooks), but you'll also get exclusive access to betas, early access programs, and special offers. We can't wait to share the new design and exciting new content with you! If you have any questions about the changes to the newsletter, please feel free to send an email to [email protected] or post a comment on my blog. If I don't hear from you before next month, then I'll simply say that I hope you enjoy the new look. Cheers, Laila

    Read the article

  • Accessing live USB files from a new HD Ubuntu install

    - by Robin Bailey
    After my live USB (Ubuntu 12.04 LTS) refused to boot, I installed the same Ubuntu version on the laptop hard drive (a dual boot next to Windows XP). This all went well without a hitch. Before this, I had spent several weeks enjoying and exploring Ubuntu from the USB pendrive, and during that time I changed lots of settings and customized Firefox and more. Now I'd like to import the home folder from the USB drive into the new install's home folder on the hard disk, which, to my knowledge, is the folder that holds all those special settings. Unfortunately, being familiar only with Windows file systems, I find the view of the USB file system from the new HDD install totally perplexing. I can't find anything that looks anywhere close to the original file system. What's more, I can't find any of the files I created and stored there, like the LibreOffice Calc file that has all my passwords (this one is really discouraging) that was stored on the Ubuntu desktop. Help me find this file alone and I'll bow down with full apologies to any and all computer gods. Being able to import all those customized settings into the new install would be a major bonus also, but hey, I'm not greedy. I'll take the passwords file and be happy! And humble! I would be very grateful for some clear, understandable help on this. Thanks

    Read the article

  • Handling Types for Real and Complex Matrices in a BLAS Wrapper

    - by mga
    I come from a C background and I'm now learning OOP with C++. As an exercise (so please don't just say "this already exists"), I want to implement a wrapper for BLAS that will let the user write matrix algebra in an intuitive way (e.g. similar to MATLAB) e.g.: A = B*C*D.Inverse() + E.Transpose(); My problem is how to go about dealing with real (R) and complex (C) matrices, because of C++'s "curse" of letting you do the same thing in N different ways. I do have a clear idea of what it should look like to the user: s/he should be able to define the two separately, but operations would return a type depending on the types of the operands (R*R = R, C*C = C, R*C = C*R = C). Additionally R can be cast into C and vice versa (just by setting the imaginary parts to 0). I have considered the following options: As a real number is a special case of a complex number, inherit CMatrix from RMatrix. I quickly dismissed this as the two would have to return different types for the same getter function. Inherit RMatrix and CMatrix from Matrix. However, I can't really think of any common code that would go into Matrix (because of the different return types). Templates. Declare Matrix<T> and declare the getter function as T Get(int i, int j), and operator functions as Matrix *(Matrix RHS). Then specialize Matrix<double> and Matrix<complex>, and overload the functions. Then I couldn't really see what I would gain with templates, so why not just define RMatrix and CMatrix separately from each other, and then overload functions as necessary? Although this last option makes sense to me, there's an annoying voice inside my head saying this is not elegant, because the two are clearly related. Perhaps I'm missing an appropriate design pattern? So I guess what I'm looking for is either absolution for doing this, or advice on how to do better.

    Read the article

  • Java enum pairs / "subenum" or what exactly?

    - by vemalsar
    I have an RPG-style Item class and I store the type of the item in an enum (itemType.sword). I want to store the subtype too (itemSubtype.long), but I want to express the relation between the two data types (a sword can be long, short, etc., but a shield can't be long or short, only round, tower, etc.). I know this is not valid source code, but it is similar to what I want:

      enum type { sword; } // not valid code!
      enum swordSubtype extends type.sword { short, long }

    Question: How can I define this connection between the two data types (or more exactly, between two values of the data types), and what is the simplest, most standard way? Options I have considered:
      - Array-like data with all valid (itemType, itemSubtype) enum pairs, or (itemType, itemSubtype[]) so there can be more than one subtype per type - this would be the best. OK, but how can I construct this in the simplest way? A special enum with a "subenum" set, a second-level enum, or anything else, if such a thing exists?
      - A 2-dimensional "canBePairs" array with itemType and itemSubtype dimensions and boolean elements: "true" means the itemType (first dimension) and itemSubtype (second dimension) pair is okay, "false" means not okay.
      - Any other, better idea.
    Thank you very much!
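
    One standard way to encode this relation (a minimal sketch with illustrative names) is to give each subtype constant a field that references its parent type, so every valid (type, subtype) pair lives in one place and can be checked at runtime:

      // Each subtype constant carries its parent type.
      enum ItemType { SWORD, SHIELD }

      enum ItemSubtype {
          SHORT(ItemType.SWORD),
          LONG(ItemType.SWORD),
          ROUND(ItemType.SHIELD),
          TOWER(ItemType.SHIELD);

          private final ItemType parent;

          ItemSubtype(ItemType parent) {
              this.parent = parent;
          }

          ItemType getParent() {
              return parent;
          }

          boolean isValidFor(ItemType type) {
              return parent == type;
          }
      }

      class EnumPairDemo {
          public static void main(String[] args) {
              // true: a sword can be long
              System.out.println(ItemSubtype.LONG.isValidFor(ItemType.SWORD));
              // false: a shield cannot be long
              System.out.println(ItemSubtype.LONG.isValidFor(ItemType.SHIELD));
          }
      }

    If you also need the reverse lookup (all subtypes of a given type), you can build an EnumMap of ItemType to EnumSet of ItemSubtype once, from these same constants.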

    Read the article

  • Should I, and how do I incorporate microdata into my asp.net website with 47 pages?

    - by Jason Weber
    I have an ASP.NET (VB) website with 47 pages. The problem is that it's in 10 different languages, although 98% just use English. I have 5 master pages. I've read Google Webmaster Tools, but I'm still confounded. I'm reading about how microdata is the way to go. Does this mean I should put itemtype and itemprop span and div tags in my master pages, or should I do all of my 47 pages (.resx resource files) separately? The main key phrase I want throughout search results is "machine vision". For instance, the first couple of sentences on my "about.aspx" page are: <span itemprop="name">USS Vision Inc.</span> (USS) is a privately-owned company with headquarters in <span itemprop="locality">Detroit, Michigan, USA</span>. We design, engineer, produce, and integrate special machine vision error-proofing products and <a href="http://www.ussvision.com/services/" target="_self" itemprop="url">services</a> that create lean factories by improving the quality of manufactured products, and by significantly reducing manufacturing costs through advanced automation. Am I doing this right, or how would I do this if I'm not? Should I use the itemprop="url" or other rich snippets for every link in my website? I mean, do I need to add an itemprop to just about everything, or can I just alter my master pages? Any guidance in this regard to help improve my SEO and SERPs would be greatly appreciated!

    Read the article

  • How to sell logistical procedures that require less time to perform but more finesse?

    - by foampile
    I am working with a group where part of the responsibility is managing a certain set of configuration files which, of course, have the same skeleton/structure across different environments but different values (like server, user, this setting, that setting, etc.). Pretty classic scenario... The problem is that everyone just goes and modifies the final, environment-specific files and basically repeats the work for every environment. Personally, I am offended to have to perform repetitive, mundane tasks in this day and age when we have technologies to automate it all. So I devised a very simple procedure of abstracting the files into templates, stubbing env-specific values with parameters, and then wrote a simple Perl script that, given a template and an environment matrix with env-specific values for each param, produces the final file. This is nothing special, cutting-edge or revolutionary - I am pretty sure that 20 years ago efficient places did their CM like that. However, it requires that changes are made at the template level and then distributed across the different environments using the script, not by making changes in the final environment-specific files. This is where I am encountering resentment, as they feel "comfortable" doing it the old, manual, repeated-labor way. Personally, I don't have a problem with them working hard rather than smart, but the problem is that when I have to build on top of someone else's changes, I have to merge their changes into my template from a specific file, which takes time and is grueling. So my question is: how do I go about selling my method, which makes things so much faster, in an environment that is resentful to change and where most things have to be done at the level of the least competent team member?
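
    The template-plus-matrix step the poster describes is essentially one substitution pass. A minimal sketch of that idea (in Java for illustration - the poster used Perl, and the ${DB_SERVER}-style placeholder names are hypothetical):

      import java.util.Map;
      import java.util.regex.Matcher;
      import java.util.regex.Pattern;

      class ConfigTemplater {

          private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([A-Z_]+)\\}");

          // Replace every ${PARAM} in the template with the value for the target environment.
          static String render(String template, Map<String, String> envValues) {
              Matcher m = PLACEHOLDER.matcher(template);
              StringBuffer out = new StringBuffer();
              while (m.find()) {
                  // Unknown parameters are left untouched so they are easy to spot.
                  String value = envValues.getOrDefault(m.group(1), m.group(0));
                  m.appendReplacement(out, Matcher.quoteReplacement(value));
              }
              m.appendTail(out);
              return out.toString();
          }

          public static void main(String[] args) {
              String template = "server=${DB_SERVER}\nuser=${DB_USER}\n";
              Map<String, String> qa = Map.of("DB_SERVER", "qa-db01", "DB_USER", "qa_app");
              System.out.print(render(template, qa));
          }
      }

    The selling point is exactly what the sketch shows: a change made once in the template reaches every environment through the matrix, instead of being hand-edited N times.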

    Read the article

  • Understanding IDAT chunk of PNG file format

    - by DRapp
    In the sample image below, I have a yellow border for display purposes only. The actual .png file is a simple black/white image, 3 pixels by 3 pixels. I originally thought of trying a 2x2, but that would not help in interpreting the low/high vs. high/low drawing stream. At least this way, I would have two black, one white from the top, or one white, two black from the bottom. So I read the chunks of data, get to the IDAT chunk, decode it (zlib) and come up with 12 bytes, as follows: 00 20 00 40 00 80. So, my question: how does the above get broken down into the 3x3 black and white sample? Also, it is saved in palette format and properly recognizes the bit depth of 1 and a color palette of 2... palette[0] is RGBA all zeros, palette[1] has RGBA of 255, 255, 255, 0. I'll eventually get into the other depth formats later; I just wanted to start with what I would expect to be the easiest. Part II: Any guidance on handling the other depth formats would help - anything special to be considered, especially regarding the alpha channel (which I am already looking for in the palette), that might trip me up.

    Read the article
