Search Results

Search found 20281 results on 812 pages for 'software engineer'.


  • How to install a Logitech c310 webcam?

    - by Teja
    I bought a new Logitech C310 webcam, but I don't know how to install it on Ubuntu. I am trying to install its software through the terminal with wine /media/LWS_2_0/Setup.exe, but it shows fixme:ole:TLB_ReadTypeLib Header type magic 0x00905a4d not supported. err:ole:TLB_ReadTypeLib Loading of typelib L"Z:\\media\\LWS_2_1\\MSetup.exe" failed with error 0 and it opens a window saying "We have detected you have connected your webcam to a USB 1.1 port; for the best performance and full feature set, we suggest using a USB 2.0 port". I tried all the USB ports available in my system and it shows the same error. Please tell me how to install this webcam on Ubuntu 11.10. Thanks for reading.

    Read the article

  • Understanding hand written lexers

    - by Cole Johnson
    I am going to make a compiler for C (C99; I own the standards PDF), written in C (go figure), and reading up on how compilers work on Wikipedia has taught me a lot. However, reading about lexers has confused me. The Wikipedia page states that the GNU Compiler Collection (gcc) uses hand-written lexers. I have tried googling what a hand-written lexer is and have come up with nothing except "making a flowchart that describes how it should function"; however, isn't that how all software development should be done? So my question is: "What is a hand-written lexer?"
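    For what it's worth, "hand-written" simply means the scanning code is written directly by a programmer, character by character, rather than generated by a tool such as lex/flex from regular-expression rules. As a purely illustrative sketch (shown here in C# only for brevity; the language does not matter, and all names are invented for this example), a tiny hand-written lexer might look like this:

      using System;
      using System.Collections.Generic;

      // A hand-written lexer: the scanning logic is coded character by character
      // instead of being generated from regular expressions by a tool like lex/flex.
      class Lexer
      {
          private readonly string _src;
          private int _pos;

          public Lexer(string src) { _src = src; }

          // Yields (kind, text) pairs until the input is exhausted.
          public IEnumerable<(string Kind, string Text)> Tokens()
          {
              while (true)
              {
                  while (_pos < _src.Length && char.IsWhiteSpace(_src[_pos])) _pos++;   // skip whitespace
                  if (_pos >= _src.Length) yield break;

                  int start = _pos;
                  char c = _src[_pos];
                  if (char.IsLetter(c) || c == '_')                                      // identifier
                  {
                      while (_pos < _src.Length && (char.IsLetterOrDigit(_src[_pos]) || _src[_pos] == '_')) _pos++;
                      yield return ("identifier", _src.Substring(start, _pos - start));
                  }
                  else if (char.IsDigit(c))                                              // integer literal
                  {
                      while (_pos < _src.Length && char.IsDigit(_src[_pos])) _pos++;
                      yield return ("number", _src.Substring(start, _pos - start));
                  }
                  else                                                                   // single-character punctuator
                  {
                      _pos++;
                      yield return ("punctuator", c.ToString());
                  }
              }
          }
      }

      class Program
      {
          static void Main()
          {
              foreach (var t in new Lexer("sum = a1 + 42;").Tokens())
                  Console.WriteLine($"{t.Kind}: {t.Text}");
          }
      }

    A generated lexer does the same job, but its state machine is produced by a tool from a declarative token specification instead of being spelled out by hand as above.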

    Read the article

  • How do I improve my code reading skills?

    - by Andrew
    Well, the question is in the title: how do I improve my code reading skills? The software/hardware environment I currently develop in is quite slow with respect to compilation times and the time it takes to test the whole system. The system is quite old and complex, and thus splitting it into several smaller, more manageable sub-projects is not feasible in the near future. I have realized that what really hinders the development progress is my code reading skills. How do I improve them, so I can spot most of the errors and issues in the code even before I hit the "do compile" key, even before I start the debugger?

    Read the article

  • How can a student programmer improve his teamwork skills?

    - by xiao
    I am a student. Recently, I have been working on a project as the leader of a team of three other students. Due to our lack of experience, the project is progressing slowly and the members are frustrated; they do not feel a sense of accomplishment in the project. I am pressured and frustrated too, but as the team leader I think I need to push them, and I do not know how. Should I help them solve coding problems, or just offer encouragement? If I pay too much attention to this, it will slow down my own progress. This is not a technical question, but it is very common in software development. I hope veteran programmers will give me some suggestions. Thanks!

    Read the article

  • Informed TDD – Kata “To Roman Numerals”

    - by Ralf Westphal
    Originally posted on: http://geekswithblogs.net/theArchitectsNapkin/archive/2014/05/28/informed-tdd-ndash-kata-ldquoto-roman-numeralsrdquo.aspx

    In a comment on my article on what I call Informed TDD (ITDD), reader gustav asked how this approach would apply to the kata “To Roman Numerals”, and whether ITDD wasn't a violation of TDD's principle of leaving out “advanced topics like mocks”. I'd like to respond to his questions with this article; there is more to say than fits into a comment.

    Mocks and TDD

    I don't see how TDD avoids or is opposed to mocks. TDD and mocks are orthogonal: TDD is about process, mocks are about structure and costs. Maybe by moving forward in tiny red+green+refactor steps less need arises for mocks. But then… if the functionality you need to implement requires “expensive” resource access, you can't avoid using mocks, because you don't want to constantly run all your tests against the real resource. True, in ITDD mocks seem to be in almost inflationary use. That's not what you usually see in TDD demonstrations. However, there is a reason for that, as I tried to explain. I don't use mocks as proxies for “expensive” resources. Rather, they are stand-ins for functionality not yet implemented. They allow me to get a test green on a high level of abstraction. That way I can move forward in a top-down fashion. But if you think of mocks as “advanced”, or if you don't want to use a tool like JustMock, then you don't need to use mocks. You just need to stand the sight of red tests for a little longer ;-) Let me show you what I mean by that by doing a kata.

    ITDD for “To Roman Numerals”

    gustav asked for the kata “To Roman Numerals”. I won't explain the requirements again; you can find descriptions and TDD demonstrations all over the internet, like this one from Corey Haines. Now here is how I would do this kata differently.

    1. Analyse

    A demonstration of TDD should never skip the analysis phase. It should be made explicit. The requirements should be formalized and acceptance test cases should be compiled. “Formalization” in this case to me means describing the API of the required functionality. “[D]esign a program to work with Roman numerals”, as written in this “requirement document”, is not enough to start software development. Coding should only begin if the interface between the “system under development” and its context is clear. If this interface is not readily recognizable from the requirements, it has to be developed first. Exploration of interface alternatives might be in order. It might be necessary to show several interface mock-ups to the customer – even if that's your fellow developer. Designing the interface is a task of its own. It should not be mixed with implementing the required functionality behind the interface. Unfortunately, though, this happens quite often in TDD demonstrations: TDD is used to explore the API and implement it at the same time. To me that's a violation of the Single Responsibility Principle (SRP), which should hold not only for software functional units but also for tasks or activities. In the case of this kata the API fortunately is obvious. Just one function is needed: string ToRoman(int arabic). And it lives in a class ArabicRomanConversions. Now what about acceptance test cases? There are hardly any stated in the kata descriptions. Roman numerals are explained, but no specific test cases are given from the point of view of a customer. So I just “invent” some acceptance test cases by picking roman numerals from a Wikipedia article.
    They are supposed to be just “typical examples” without special meaning. Given the acceptance test cases, I then try to develop an understanding of the problem domain. I'll spare you that. The domain is trivial and is explained in almost all kata descriptions. How roman numerals are built is not difficult to understand. What's more difficult, though, might be to find an efficient solution for converting into them automatically.

    2. Solve

    The usual TDD demonstration skips a solution-finding phase. Like the interface exploration, it's mixed in with the implementation. But I don't think this is how it should be done. I even think this is not how it really works for the people demonstrating TDD. They're simplifying their true software development process because they want to show a streamlined TDD process. I doubt this is helping anybody. Before you code, you had better have a plan for what to code. This does not mean you have to do “Big Design Up-Front”. It just means: have a clear picture of the logical solution in your head before you start to build a physical solution (code). Evidently such a solution can only be as good as your understanding of the problem. If that's limited, your solution will be limited, too. Fortunately, in the case of this kata your understanding does not need to be limited, so the logical solution does not need to be limited or preliminary or tentative. That does not mean you need to know every line of code in advance. It just means you know the rough structure of your implementation beforehand, because it should mirror the process described by the logical or conceptual solution. Here is my solution approach: The arabic “encoding” of numbers represents them as an ordered set of powers of 10. Each digit is a factor to multiply a power of ten with. The “encoding” 123 is the short form for a set like this: {1*10^2, 2*10^1, 3*10^0}. And the number is the sum of the set members. The roman “encoding” is different. There is no base (like 10 for arabic numbers); there are just digits of different value, and they have to be written in descending order. The “encoding” XVI is short for [10, 5, 1]. And the number is still the sum of the members of this list. The roman “encoding” thus is simpler than the arabic. Each “digit” can be taken at face value. No multiplication with a base is required. But what about IV, which looks like a contradiction to the above rule? It is not – if you accept roman “digits” not to be limited to single characters only. Usually I, V, X, L, C, D, M are viewed as “digits”, and IV, IX etc. are viewed as nuisances preventing a simple solution. Everything looks different, though, once IV, IX etc. are taken as “digits”. Then MCMLIV is just a sum: M+CM+L+IV, which is 1000+900+50+4. Whereas before it would have been understood as M-C+M+L-I+V – which is more difficult, because here some “digits” get subtracted. Here is the list of roman “digits” with their values: {1, I}, {4, IV}, {5, V}, {9, IX}, {10, X}, {40, XL}, {50, L}, {90, XC}, {100, C}, {400, CD}, {500, D}, {900, CM}, {1000, M}. Since I take IV, IX etc. as “digits”, translating an arabic number becomes trivial. I just need to find the values of the roman “digits” making up the number; e.g. 1954 is made up of 1000, 900, 50, and 4. I call those “digits” factors. If I move from the highest factor (M=1000) to the lowest (I=1), then translation is a three-step process: find all the factors, translate the factors found, and compile the roman representation. Translation is just a look-up.
    Finding, though, needs some calculation: find the highest remaining factor fitting into the value, remember it and subtract it from the value, and repeat with the remaining value and the remaining factors. Please note: this is just an algorithm, not code, even though it might be close. Being so close to code in my solution approach is due to the triviality of the problem. In more realistic examples the conceptual solution would be on a higher level of abstraction. With this solution in hand, I finally can do what TDD advocates: find and prioritize test cases. As I can see from the small process description above, there are three aspects to test: the translation, the compilation, and finding the factors. Testing the translation primarily means checking whether the map of factors and digits is comprehensive. That's simple, even though it might be tedious. Testing the compilation is trivial. Testing factor finding, though, is a tad more complicated. I can think of several steps: first check whether an arabic number equal to a factor is processed correctly (e.g. 1000=M). Then check whether an arabic number consisting of two consecutive factors (e.g. 1900=[M,CM]) is processed correctly. Then check whether a number consisting of the same factor twice is processed correctly (e.g. 2000=[M,M]). Finally check whether an arabic number consisting of non-consecutive factors (e.g. 1400=[M,CD]) is processed correctly. I feel I can start an implementation now. If something becomes more complicated than expected, I can slow down and repeat this process.

    3. Implement

    First I write a test for the acceptance test cases. It's red because there is no implementation even of the API. That's in conformance with “TDD lore”, I'd say. Next I implement the API. The acceptance test now is formally correct, but still red of course. This will not change even now that I zoom in, because my goal is not to satisfy these tests as quickly as possible, but to implement my solution in a stepwise manner. That I do by “faking” it: I just “assume” three functions to represent the transformation process of my solution. My hypothesis is that those three functions in conjunction produce correct results on the API level. I just have to implement them correctly. That's what I'm trying now – one by one. I start with a simple “detail function”: Translate(). And I start with all the test cases in the obvious equivalence partition. As you can see, I dare to test a private method. Yes, that's a white-box test. But as you'll see, it won't make my tests brittle. It serves a purpose right here and now: it lets me focus on getting one aspect of my solution right. The implementation to satisfy the test is as simple as possible – just how TDD wants me to do it: KISS. Now for the second equivalence partition: translating multiple factors. (It's a pattern: if you need to do something repeatedly, separate the tests for doing it once from the tests for doing it multiple times.) In this partition I just need a single test case, I guess. Stepping up from a single translation to multiple translations is no rocket science. Usually I would have implemented the final code right away; splitting it into two steps is just for “educational purposes” here. How small your implementation steps are is a matter of your programming competency. Some “see” the final code right away before their mental eye – others need to work their way towards it. Having two tests is what I find more important. Now for the next low-hanging fruit: compilation. It's even simpler than translation. A single test is enough, I guess.
    And normally I would not even have bothered to write that one, because the implementation is so simple; I don't need to test .NET framework functionality. But again: if it serves the educational purpose… Finally, the most complicated part of the solution: finding the factors. There are several equivalence partitions, but I still decide to write just a single test, since the structure of the test data is the same for all partitions. Again, I'm faking the implementation first: I focus on just the first test case. No looping yet. Faking lets me stay on a high level of abstraction. I can write down the implementation of the solution without bothering myself with the details of how to actually accomplish the feat. That's left for a drill-down with a test of the fake function. There are two main equivalence partitions, I guess: either the first factor is appropriate, or some later one is. The implementation seems easy. Both test cases are green. (Of course this only works on the premise that there is always a matching factor – which is the case, since the smallest factor is 1.) And the first of the equivalence partitions on the higher level also is satisfied. Great, I can move on. Now for more than a single factor: interestingly, not just one test becomes green now, but all of them. Great! You might say that then I must not have done the simplest thing possible. And I would reply: I don't care. I did the most obvious thing. But I also find this loop very simple – even simpler than the recursion I had briefly thought of during the problem-solving phase. And by the way: the acceptance tests also went green. Mission accomplished – at least functionality-wise. Now I have to tidy things up a bit. TDD calls for refactoring. Not much refactoring is needed, because I wrote the code in a top-down fashion. I faked it until I made it. I endured red tests on higher levels while lower levels weren't perfected yet. But this way I saved myself from refactoring tediousness. At the end, though, some refactoring is required – but maybe in a different way than you would expect. That's why I rather call it “cleanup”. First I remove duplication. There are two places where factors are defined: in Translate() and in Find_factors(). So I factor the map out into a class constant, which leads to a small conversion in Find_factors(). And now for the big cleanup: I remove all tests of private methods. They are scaffolding tests to me. They only have temporary value. They are brittle. Only acceptance tests need to remain. However, I carry over the single-“digit” tests from Translate() to the acceptance test. I find them valuable to keep, since the other acceptance tests only exercise a subset of all roman “digits”. This then is my final test class, and this is the final production code. Test coverage as reported by NCrunch is 100%.

    Reflexion

    Is this the smallest possible code base for this kata? Surely not. You'll find more concise solutions on the internet. But LOC are of relatively little concern – as long as I can understand the code quickly. So-called “elegant” code, however, often is not easy to understand. The same goes for KISS code – especially if left unrefactored, as is often the case. That's why I progressed from requirements to final code the way I did. I first understood and solved the problem on a conceptual level. Then I implemented it top-down according to my design. I also could have implemented it bottom-up, since I knew some of the bottom of the solution – the leaves of the functional decomposition tree.
    Where things became fuzzy because the design did not cover any more details, as with Find_factors(), I repeated the process in the small, so to speak: fake some top level, endure red high-level tests, and first solve a simpler problem. Using scaffolding tests (to be thrown away at the end) brought two advantages: encapsulation of the implementation details was not compromised, since private methods could naturally stay private and I did not need to make them internal or public just to be able to test them; and I was able to write focused tests for small aspects of the solution, with no need to test everything through the solution root, the API. The bottom line for me thus is: Informed TDD produces cleaner code in a systematic way. It conforms to core principles of programming: the Single Responsibility Principle and/or Separation of Concerns. Distinct roles in development – being a researcher, being an engineer, being a craftsman – are represented as different phases. First find out what there is. Then devise a solution. Then code the solution, manifest the solution in code. Writing tests first is a good practice, but it should not be taken dogmatically, and above all it should not be overloaded with purposes. And finally: moving from top to bottom through a design produces refactored code right away. Clean code thus is almost inevitable – and not left to a refactoring step at the end, which is often skipped for various reasons.

    PS: Yes, I have done this kata several times. But that only has an impact on the time needed for phases 1 and 2. I won't skip them because of that. And there are no shortcuts during implementation because of that.
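    The post's code listings do not survive in the text above, so here is a rough, unofficial sketch only: the approach described (a factor map kept as a class constant, Translate() as a pure look-up, a greedy Find_factors(), and a trivial compilation step) might look roughly like the following C#. The method bodies and the Compile() name are my own reconstruction from the prose, not the author's actual code.

      using System;
      using System.Collections.Generic;
      using System.Linq;

      public class ArabicRomanConversions
      {
          // The map of roman "digits" (factors) and their values, ordered from the
          // highest factor to the lowest so the greedy search below works.
          private static readonly (int Value, string Digit)[] Factors =
          {
              (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
              (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
              (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")
          };

          public static string ToRoman(int arabic)
          {
              var factors = Find_factors(arabic);   // e.g. 1954 -> 1000, 900, 50, 4
              var digits = Translate(factors);      // e.g. -> "M", "CM", "L", "IV"
              return Compile(digits);               // e.g. -> "MCMLIV"
          }

          // Find the highest remaining factor fitting into the value, subtract it,
          // and repeat with the remaining value.
          private static IEnumerable<int> Find_factors(int arabic)
          {
              foreach (var (value, _) in Factors)
                  while (arabic >= value)
                  {
                      yield return value;
                      arabic -= value;
                  }
          }

          // Translation is just a look-up in the factor map.
          private static IEnumerable<string> Translate(IEnumerable<int> factors) =>
              factors.Select(f => Factors.First(x => x.Value == f).Digit);

          // Compilation simply concatenates the roman "digits".
          private static string Compile(IEnumerable<string> digits) =>
              string.Concat(digits);
      }

      public static class Program
      {
          public static void Main()
          {
              Console.WriteLine(ArabicRomanConversions.ToRoman(1954));   // MCMLIV
              Console.WriteLine(ArabicRomanConversions.ToRoman(2014));   // MMXIV
          }
      }

    The whole approach hinges on treating IV, IX, XL, XC, CD, and CM as "digits" in their own right, which is exactly what removes any need for subtraction logic.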

    Read the article

  • Kinetic Scroll in Maverick

    - by Shoaib
    I have a Sony VAIO FW 230J laptop. I was using Lucid and upgraded my system to the Maverick beta in September 2010. Although there were a few issues with the beta release on my system, none of them left me stuck. The most notable experience on my Ubuntu install was the smooth, kinetic scrolling and the very sensitive touchpad with finer control, which was oddly just ordinary in previous releases and even on Windows 7 currently. This release of Ubuntu is definitely improved for a touch UI experience; I believe this is the magic of 10.10. For my office system I waited until the final release was out. After installing (upgrading) the office system to Maverick, I am not getting the same smooth, kinetic scroll experience. My laptop has an Intel GMA X4500MHD graphics chip, while my office desktop has an Intel GMA X4500HD. Do I need to install or update some software package to get a similar kinetic scroll experience?

    Read the article

  • Cyclic Dependencies.

    - by PhilCK
    Are cyclic dependencies a common thing in game dev? I ask because I keep getting into situations where I use them, and I have been told more than once that they should be avoided. I am wondering if this is just something people say as a general rule of thumb in the software development business, and whether the nature of game programming produces such dependencies. For example: // Foo #include <Bar.hpp> class Foo { Bar& m_bar; }; and // Bar class Foo; class Bar { Foo* m_foo; }; I do this a lot in Ruby, but dynamic languages are more forgiving in this instance, whereas static ones are not so much.

    Read the article

  • Oracle Exalogic Elastic Cloud - Planned Webcasts

    - by chuck.speaks
    I’m putting together a collection of recorded webcasts around Oracle Exalogic Elastic Cloud (Exalogic). The plan is to do a systems overview and then multiple deep dives into the hardware and software components that make up the engineered system. Those of you who are members of our partner community (Oracle Partner Network), drop me a note if you are interested in a full-blown in-class delivery via PTS resources. There is no schedule for these workshops, but if there is enough interest, I would venture to guess it would roll out soon. Those of you with applications certified on Oracle WebLogic Server who would like to scale to Exalogic, see me or watch this space.   Chuck Speaks chuck <dot> speaks at oracle <dot> com

    Read the article

  • TechEd 2010 Day 1 Recap

    We've teamed up with Habitat for Humanity to build a new home for a deserving family in New Orleans. The wall-raising ceremony for the home will be this Friday, June 11th. If you're at TechEd and would like to help, come by the DevExpress booth on Aisle 2600 at Booth 13 and let us know. Day 1 is almost over here at TechEd 2010 in New Orleans, and it was a busy day. Microsoft has combined the IT and software crowds, so there are plenty of attendees. The first day was filled with plenty of questions...

    Read the article

  • Google I/O 2010 - Building your own Google Wave provider

    Google I/O 2010 - Open source Google Wave: Building your own wave provider. Wave 101 with Dan Peterson, Jochen Bekmann, JD Zamfirescu Pereira, and David LaPalomento (Novell). Learn how to build your own wave service. Google is open sourcing the lion's share of the code that went into creating Google Wave to help bootstrap a network of federated providers. This talk will discuss the state of the reference implementation: the software architecture, how you can plug it into your own use cases, and how you can contribute to the code and definition of the underlying specification. For all I/O 2010 sessions, please go to code.google.com. From: GoogleDevelopers. Time: 59:03.

    Read the article

  • Bluetooth is always on after waking from hibernate

    - by Ali
    I have a problem: whenever my machine, a Lenovo W520, wakes from hibernation, Bluetooth turns on. It does not matter what the state was before hibernating; after hibernation Bluetooth is always on. The Bluetooth settings even say that Bluetooth is off, but I can clearly see that the Bluetooth LED is on. I would like to know what the problem is. Is it a hardware problem or a software one? How can I fix it?

    Read the article

  • Imitating Exchange Server's "RBAC AuthZ" in my own application... (is there something similar?)

    - by makerofthings7
    Exchange 2010 has a delegation model where groups of WinRM cmdlets are essentially grouped into roles, and the roles are assigned to a user. This is a great and flexible model, considering how I can leverage all the benefits of PowerShell while using the right low-level technologies (WCF, SOAP, etc.) and requiring no additional software on the client side. Questions: Is there a way for me to leverage Exchange's delegation model in my .NET application? Has anyone attempted to imitate this model? If I must start from scratch, how would I go about imitating this approach?
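    For what it's worth, a bare-bones imitation of that model in plain .NET boils down to two tables: named roles that group permitted operations (the analogue of cmdlet groups), and role assignments per user, with authorization reduced to a membership check. The sketch below only illustrates that shape; every type and member name in it is invented, and it is not an Exchange or Microsoft API:

      using System;
      using System.Collections.Generic;
      using System.Linq;

      // A role is just a named group of permitted operations, analogous to the way
      // Exchange groups cmdlets into management roles.
      class Role
      {
          public string Name { get; }
          public ISet<string> Operations { get; }
          public Role(string name, IEnumerable<string> operations)
          {
              Name = name;
              Operations = new HashSet<string>(operations, StringComparer.OrdinalIgnoreCase);
          }
      }

      class RbacAuthorizer
      {
          private readonly Dictionary<string, List<Role>> _rolesByUser =
              new Dictionary<string, List<Role>>(StringComparer.OrdinalIgnoreCase);

          // Role assignment: a user gets one or more roles.
          public void Assign(string user, Role role)
          {
              if (!_rolesByUser.TryGetValue(user, out var roles))
                  _rolesByUser[user] = roles = new List<Role>();
              roles.Add(role);
          }

          // Authorization is a membership check: the operation must appear in at
          // least one of the user's roles.
          public bool IsAuthorized(string user, string operation) =>
              _rolesByUser.TryGetValue(user, out var roles) &&
              roles.Any(r => r.Operations.Contains(operation));
      }

      class Program
      {
          static void Main()
          {
              var recipientAdmin = new Role("Mail Recipients",
                  new[] { "Get-Mailbox", "Set-Mailbox" });

              var authz = new RbacAuthorizer();
              authz.Assign("alice", recipientAdmin);

              Console.WriteLine(authz.IsAuthorized("alice", "Set-Mailbox"));   // True
              Console.WriteLine(authz.IsAuthorized("alice", "New-Mailbox"));   // False
          }
      }

    The real Exchange RBAC model layers management scopes and role assignment policies on top, but they all build on the same "role = set of allowed operations" core shown here.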

    Read the article

  • Which tool to use for "home banking"?

    - by Huygens
    I would like to manage my bank accounts in a secure manner on Ubuntu. I saw several applications in the Software Centre, but I don't know which one to choose. I don't need fancy features like stock options; I just have regular accounts which I want to follow, and I don't want complicated stuff. As bank data is quite sensitive, I would highly prefer an application that encrypts its data. Though if you have a really cool app that does not have this feature, as long as it stores the data in one dedicated place, I could manage by encrypting that place myself. So what tool do you use that could fit my needs?

    Read the article

  • JCP Awards 10 Year Retrospective

    - by Heather VanCura
    As we celebrate 10 years of JCP Program Award recognition in 2012, take a look back at the Retrospective article covering the history of the JCP awards. Most recently, the JCP awards were celebrated at JavaOne Latin America in Brazil, where SouJava was presented the JCP Member of the Year Award for 2012 (won jointly with the London Java Community) for their contributions and launch of the Global Adopt-a-JSR Program. This is also a good time to honor the JCP Award Nominees and Winners who have been designated as Star Spec Leads. Spec Leads are key to the Java Community Process (JCP) program. Without them, none of the Java Specification Requests (JSRs) would have begun, much less completed and become implemented in shipping products. Nominations for 2012 Star Spec Leads are now open until 31 December. The Star Spec Lead program recognizes Spec Leads who have repeatedly proven their merit by producing high-quality specifications, establishing best practices, and mentoring others. The point of such an honor is to endorse the good work that they do, showcase their methods for other Spec Leads to emulate, and motivate other JCP program members and participants to get involved in the JCP program. Ed Burns – A Star Spec Lead for 2009, Ed first got involved with the JCP program when he became co-Spec Lead of JSR 127, JavaServer Faces (JSF), a role he has continued through JSF 1.2 and now JSF 2.0, which is JSR 314. Linda DeMichiel – Linda has been involved in the JCP program from its very early days. She has been the Spec Lead on at least three JSRs and an EC member for another three. She holds a Ph.D. in Computer Science from Stanford University. Gavin King – Nominated as a JCP Outstanding Spec Lead for 2010 for his work with JSR 299. His endorsement said, “He was not only able to work through disputes and objections to the evolving programming model, but he resolved them into solutions that were more technically sound, and which gained the support of its pundits.” Mike Milikich – Nominated for his work on Java Micro Edition (ME) standards, implementations, tools, and Technology Compatibility Kits (TCKs), Mike was a 2009 Star Spec Lead for JSR 271, Mobile Information Device Profile 3. David Nuescheler – Serving as the CTO for Day Software, acquired by Adobe Systems, David has been a key player in the growth of the company's global content management solution. In 2002, he became Spec Lead for JSR 170, Content Repository for Java Technology API, continuing for the subsequent version, JSR 283. Bill Shannon – A well-respected name in the Java community, Bill came to Oracle from Sun as a Distinguished Engineer and is still performing at full speed as Spec Lead for JSR 342, Java EE 7, as an alternate EC member, and as a hands-on problem solver for the Java community as a whole. Jim Van Peursem – Jim holds a PhD in Computer Engineering. He was part of the Motorola team that worked with Sun Labs on the Spotless VM that became the KVM. From within Motorola, Jim has been responsible for many aspects of Java technology deployment, from independent Connected Limited Device Configuration (CLDC) and Mobile Information Device Profile (MIDP) implementations, to handset development, to working with the industry in defining many related standards. Participation in the JCP Program goes well beyond technical proficiency.
The JCP Awards Program is an attempt to say “Thank You” to all of the JCP members, Expert Group Members, Spec Leads, and EC members who give their time to contribute to the evolution of Java technology.

    Read the article

  • How to Configure Microsoft Security Essentials

    Microsoft Security Essentials is the software giant's free solution for home users as well as small businesses. As long as you have a genuine copy of Windows running on your PC, you can enjoy all it has to offer. The program is characterized by easy installation and a user interface that is intuitive and rather simple to navigate. With so many viruses, spyware, and other malicious items floating all around the Web, keeping your PC secure should be of utmost importance. After all, you want to protect your investment and your sanity at the same time. Having a solid program such as Microsof...

    Read the article

  • Cannot format a FAT filesystem, getting error "Both FATs appear to be corrupt. Giving up."

    - by Nilesh
    I am trying to format a FAT (or FAT32) file system on my Ubuntu system, but I am not able to format the device; each time I get the error "Both FATs appear to be corrupt. Giving up." I have tried options like sudo dosfsck -t -a -w /dev/sdc1 and sudo dosfsck -w -r -l -a -v -t /dev/sdc1 but the same message comes up each time. Can anyone guide me on how to recover the filesystem? I also don't mind losing the data on this drive, as it is an external pen drive. Also, can you please suggest some method other than booting from a CD with software like GParted or something like that.

    Read the article

  • New A-Team Web Site Launched

    - by .raja
    The A-Team has launched a new web site – the A-Team Chronicles – which aggregates and organizes content produced by A-Team members (including your humble blogger). The A-Team is a central, outbound, highly technical team comprised of Enterprise Architects, Solution Specialists and Software Engineers within the Fusion Middleware Product Development Organization that works with customers and partners worldwide, providing guidance on implementation best practices, architecture, troubleshooting and how best to use Oracle products to solve customer business needs. This content captures best practices, tips and tricks and guidance that A-Team members gain from real-world experience, working with customers and partners on implementation projects, through architecture reviews, issue resolution and more. A-Team Chronicles makes this content available, through short and to-the-point articles, to all our customers and partners in a consistent, easy-to-find and organized way. If you like the articles we post here, you might find even more interesting articles at the new A-Team Chronicles site, covering a wider range of Fusion Middleware topics. We will be decommissioning this site shortly in favor of the A-Team Chronicles site, and all new content will be posted there.

    Read the article

  • Ok it has been pointed out to me

    - by Ratman21
    That it seems my blog is more of a "poor me" or "pity me" or "I deserve a job" blog. Hmmm. I won't say I have not whined here, as I have used this blog to vent my frustration about the whole out-of-work thing (lack of money, self-worth, family issues and the never-ending bills coming my way), but it was also me trying to reach out to others in the same boat, as well as advertising: hey, I am out here, employers. It was also said that I don't have anything listed here about me, like a cover letter or resume. Well, there is, but it was so many months and posts ago. Also, what I had posted is not current. So here is my most current cover letter and resume. Scott L Newman, 45219 Dutton Way, Callahan, Fl. 32011. To Whom It May Concern: I am really interested in the IT vacancy that you have listed for your company. Maybe I don't have all the qualifications you want (hold on, don't hit delete yet) yet! But maybe I do, as I have over 20+ years of experience in IT right now. Read the rest of my cover letter and my resume. You will see what my IT skills are, and it will show that I can do this work! I can bring to your company, along with my can-do attitude, a broad range of skills, including: Certified CompTIA A+, Security+ and Network+ Technician; 2.5 years (NOC) network experience on a large Cisco-based WAN, UK to Austria; 20 years experience in MIS/DP (yes, I can do IBM mainframes and Tandem NonStops too); 18 years experience as technical help desk support (panicking users, no problem); 18 years experience with PC/server-based systems, intranet and internet systems; 10+ years experience with Microsoft Office, Windows XP and data network fundamentals (yes, I do Windows); strong troubleshooting skills for software, hardware and circuit issues (and I can tell you what kind of horrors I had to face with all of them); very experienced in working with customers on problems (again, panicking users, no problem); working experience with remote access (VPN/SecurID) – I didn't just study them, I worked on/with them; skilled in gathering information for and creating documentation for operation procedures (I don't just wait for someone to give it to me, I go out and get it; waiting for info on working applications is, well, dumb); multiple software languages (hey, I have done some programming); and much more experience in IT (mortgage, stocks and financial information systems experience, and I have worked IT in a hospital). I can multitask, and I have the ability to adapt to change and learn quickly. (I was once put in charge of a system that I had not worked with for over two years. Talk about having to relearn and adapt to changes, but I did it.) I would welcome the opportunity to further discuss this position with you. If you have questions or would like to schedule an interview, please contact me by phone at 904-879-4880, on my cell at 352-356-0945, by e-mail at [email protected], or leave a message on my web site (http://beingscottnewman.webs.com/). I have enclosed/attached my resume for your review and I look forward to hearing from you. Thank you for taking a moment to consider my cover letter and resume. I appreciate how busy you are. Sincerely, Scott L. Newman.

    Scott L. Newman – 45219 Dutton Way, Callahan, FL 32011 – H (904) 879-4880, C (352) 356-0945 –
    [email protected] – Web: http://beingscottnewman.webs.com/
    OBJECTIVE: To obtain a Network Operations or Helpdesk position.
    PROFILE: Information Technology professional with 20+ years of experience. Volunteer website creator and back-up sound technician at True Faith Christian Fellowship. CompTIA A+, Network+ and Security+ certified.
    TECHNICAL AND PROFESSIONAL SKILLS: Technical Support, Frame Relay, Microsoft Office Suite, Inventory Management, ISDN, Windows NT/98/XP, Client/Vendor Relations, CICS, Cisco Routers/Switches, Networking/Administration, RPG, Helpdesk, Website Design/Dev./Management, Assembler, Visio, Programming, COBOL IV.
    EDUCATION: New Horizons Computer Learning Center, Jacksonville, Florida – CompTIA A+, Security+ and Network+ certified; currently working on CCNA certification. Mott Community College, Flint, Michigan – Associate's Degree, Data Processing and General Education. Currently studying Japanese.
    PROFESSIONAL: True Faith Christian Fellowship Church – Callahan, FL, October 2009 to present: Web site tech – web site creator/tech, back-up song leader and back-up sound technician (church web site: http://ambassadorsforjesuschrist.webs.com/). U.S. Census (temp employee), Feb. 23 to March 8, 2010: Enumerator for Nassau County. Thomas Creek Baptist Church – Callahan, FL, June 2008 to September 2009: church sound and video technician. Fidelity National Information Services – Jacksonville, FL, February 01, 2005 to October 28, 2008: Client Server Dev/Analyst I – monitored multiple debit card sites, check authorization customers and the card auth system (AuthNet) for problems with the sites, connections, servers (on our LAN) and/or applications; night (NOC) network operator for a large wide area network (WAN); monitored multiple check authorization customers for problems with circuits, routers and applications; resolved circuit and/or router issues or assisted the circuit carrier in resolving the issue; resolved application problems or assisted application support in their resolution; liaison between customer and application support; maintained and updated the NetOps operation procedures guide; kept the listing of equipment on the raised floor updated; involved in the training of all night check and card server operations operators; FNIS acquired Certegy in 2005 and I was one of three kept on. Certegy – St. Pete, FL, August 31, 2003 to February 1, 2005: Senior NetOps Operator (FNIS acquired Certegy in 2005; all of the above jobs/skills were the same as listed for FNIS) – converted documentation to Adobe format; sole trainer of day/night shift System Management Center (SMC) operators; Equifax spun off the Card/Check dept. as Certegy, Certegy terminated its contract with EDS, and I was one of six in the whole IT dept. who were kept on. EDS (Certegy account) – St. Pete, FL, July 1, 1999 to August 31, 2003: Senior NetOps Operator – Equifax outsourced the NetOps dept. to EDS in 1999; same job skills as listed above for FNIS. Equifax – St. Pete & Tampa, FL, January 1, 1991 to July 1, 1999: NetOps/Tandem Operator – all of the above for FNIS, except for circuit and router issues; operated, monitored and troubleshot the Tandem mainframe and servers on the LAN; supported the operation of the print, tape and microfiche rooms; Equifax acquired TelaCredit in 1991. TelaCredit – Tampa, FL, June 28, 1989 to January 1, 1991: Tandem Operator – operated and monitored Tandem NonStop systems for card and check auths; operated multiple high-speed laser printers and microfiche printers; mounted, filed and maintained 18 reel-to-reel mainframe tape drives, cartridge tape drives and the tape library.

    Read the article

  • How important is it to sacrifice your free time for accomplishing goals? [closed]

    - by Darf Zon
    I was reading a book about XP programming and agile teams. While reading, I came across the following scenarios. (I've never worked with a development team, just in school, so I would like to know what you think about these situations.) First: your boss has asked you to deliver software in a time frame that can only be met by asking the project team to work overtime without pay. All team members have young children. Discuss whether you should accept this request from your boss or should persuade the team to give their time to the organization rather than to their families. What could be significant factors in the decision? Second: as a programmer, you are offered a promotion to project manager, but your feeling is that you can make a more effective contribution in a technical role than in an administrative one. Write about when you should accept that promotion. Sometimes I sacrifice my free time to accomplish goals at work, so it is very important to me to know your opinion, based on your experience.

    Read the article

  • Building TrueCrypt on Ubuntu 13.10

    - by linuxubuntu
    With the whole NSA thing, people tried to re-build binaries identical to the ones which truecrypt.org provides, but they didn't succeed. So some think the official binaries might be compiled with back doors which are not in the source code. So how do I compile TrueCrypt on the latest Ubuntu version (I'm using Ubuntu GNOME, but that shouldn't matter)? I tried some tutorials for previous Ubuntu versions, but they don't seem to work any more. Edit: https://madiba.encs.concordia.ca/~x_decarn/truecrypt-binaries-analysis/ Now you might think "OK, we don't need to build it ourselves", but: to build it, he used closed-source software, and there are proofs of concept where a compromised compiler still puts backdoors into the binary: 1. the source is without backdoors, 2. the binary is identical to the reference binary, 3. the binary still contains backdoors.

    Read the article

  • Who are the outspoken critics of Object-Oriented design?

    - by Xepoch
    Sure, object-oriented techniques are great and have stuck around for a while. I know of only a handful of critics of OO principles. It seems as though most non-OO designs and architectures are shunned, yet we continue to write a lot of good software in C and solve a lot of data-manipulation problems via awk/sed, among countless other examples. Correct tool for the correct job, yes? I'm having a hard time finding articles, presentations, or published criticisms of OO (even Fred Brooks has blessed information hiding). Are there any well-known, published and/or outspoken critics of OO?

    Read the article

  • Good questions to ask a potential new boss?

    - by David Johnstone
    I first asked this question on Stack Overflow, but it turns out this is a better place for it. Imagine you were working as a software developer. Imagine that the manager of your team leaves and your company is looking for a replacement. Imagine that as part of the hiring process you had the opportunity to talk with him. You are not the only person doing an interview, and while it is not ultimately your decision whether or not to hire him, you do have an influence. What questions would you ask? What would you talk with him about?

    Read the article

  • What makes a good developer / system documentation?

    - by deamon
    Much time is wasted getting new developers started with existing software systems because there is no good documentation. But what makes system documentation good? One part is good API documentation, like the Java API docs, but how do you convey the "bigger picture" and the other things that cannot be placed in an API doc? One constraint is that it should not be too hard and time-consuming to write the docs, because that is one reason why they are omitted so often. So, what makes documentation good?

    Read the article

  • Gilbane Conference San Francisco 2010

    I attended the Gilbane Conference San Francisco 2010 today and did a short presentation on: Open Source Tools That are Changing the Content Technology Landscape. Open source tools are dramatically changing our perceptions of software and how we invest in tools for content creation, management and delivery. Open source tools are created more cheaply by a broad team of developers, but they may also require strong support organizations to make them work, so they are never free. This session will examine...

    Read the article

  • How do you Install the Latest Release of Miro?

    - by Brenton Horne
    In the Software Centre the latest release of Miro available is 4.0.4, whereas the latest release of Miro is 5.0.4. How do I download 5.0.4 on 12.10? I have tried following the guide at http://www.getmiro.com/download/for-ubuntu/ (and thus have already run sudo add-apt-repository ppa:pcf/miro-releases), but it failed, and when I ran sudo apt-get update I received the error: W: Failed to fetch http://ppa.launchpad.net/pcf/miro-releases/ubuntu/dists/quantal/main/source/Sources 404 Not Found W: Failed to fetch http://ppa.launchpad.net/pcf/miro-releases/ubuntu/dists/quantal/main/binary-i386/Packages 404 Not Found E: Some index files failed to download. They have been ignored, or old ones used instead.

    Read the article
