Search Results

Search found 26592 results on 1064 pages for 'information presentation'.


  • Technicolor TG582n with external DHCP server [on hold]

    - by Jack
    We have a small home setup with a Technicolor TG582n on the Plusnet ISP. We have a Samba4 DC with DNS forwarding enabled; the DC forwards to the Technicolor. However, the client machines have their DNS settings set manually, which is an annoyance when using laptops on other networks. We would like DHCP handled on the server machine, so that when a client connects to the Technicolor it gets its IP and DNS information from the DHCP server, eliminating the need to set adapter DNS settings by hand. However, I cannot find an option to disable DHCP on the Technicolor, and I am not clear on how one would point clients at the server's DHCP from the Technicolor even if the option existed. So, how would one make the Technicolor network use an external server for DHCP leases?
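    (A hedged sketch of the server side of such a setup, assuming the ISC DHCP server on the Samba4 DC; all addresses are placeholders. The Technicolor's own DHCP pool would still need to be disabled, or the two servers will race to answer clients.)

        # Minimal ISC dhcpd.conf sketch for the DC
        subnet 192.168.1.0 netmask 255.255.255.0 {
            range 192.168.1.100 192.168.1.200;
            option routers 192.168.1.254;            # the Technicolor gateway
            option domain-name-servers 192.168.1.10; # the Samba4 DC (DNS forwarder)
        }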

    Read the article

  • Custom Key Flexfield (KFF) in Oracle Applications

    - by Manoj Madhusoodanan
    In this blog I will explain how to create a custom KFF. I am using the XXCUST_KFF_DEMO table to capture the KFF code combinations. The following steps need to be performed.
    1) Register the XXCUST_KFF_DEMO table. Click here to see the code. Verify that the table has been created successfully. Navigation: Application Developer > Application > Database > Table
    2) Register the Key Flexfield. Navigation: Application Developer > Flexfield > Key Flexfields
    3) Define the structure and segments. Navigation: Application Developer > Flexfield > Key Flexfield Segments. Click on the Segments button and save the information. Check the Allow Dynamic Inserts check box if you want to create combinations from the KFF display window. Once you have completed all the changes, check the Freeze Flexfield Definition check box.
    4) Create a sequence, XXCUST_KFF_DEMO_S.
    5) Try to create a KFF item through OAF or Forms. Here I am using a page based on the table XXCUST_KFF_TRN. You can see the output below.
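    (For step 4, a minimal SQL sketch of the sequence creation; the options shown are assumptions, adjust to your standards:)

        -- Step 4: sequence backing the KFF combinations table's id column
        CREATE SEQUENCE XXCUST_KFF_DEMO_S
          START WITH 1
          INCREMENT BY 1
          NOCACHE;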

    Read the article

  • Routing Essentials

    - by zharvey
    I'm a programmer trying to fill a big hole in my understanding of networking basics. I've been reading a good book (Networking Bible by Sosinsky), but I find it contains a lot of assumed knowledge: terms and concepts are thrown at the reader without a proper introduction. I understand that a "route" is a path through a network, but I am struggling to visualize some routing-based concepts. Namely:
    - How do routes actually manifest themselves in the hardware? Are they just lists of IP addresses computed at the network layer and then executed by the transport layer?
    - What kind of data exists in a so-called routing table? Is a routing table just the mechanism for holding these lists of IP addresses (see above)?
    - What are the performance pros/cons of a static route as opposed to a dynamic route?
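    (To make the routing-table question concrete, here is an illustrative Linux routing table as printed by ip route show; the addresses are made up. Each entry maps a destination prefix to an outgoing interface and an optional next-hop gateway, and the kernel forwards each packet along the most specific matching entry:)

        $ ip route show
        default via 192.168.1.1 dev eth0
        192.168.1.0/24 dev eth0 proto kernel scope link src 192.168.1.5
        10.0.0.0/8 via 192.168.1.254 dev eth0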

    Read the article

  • Speeding up Connection Between Computer and Wireless/Bridged Router

    - by Justian Meyer
    Hey everyone, I looked through other questions but didn't find useful responses. Our main computer gets a download speed of 6 Mbps, but some of our other computers are getting only 40-200 Kbps! The router is wireless, but all computers are connected through a Netgear Wall-Plugged Bridge XE102, which transmits data over the building's powerline. It can't be the hardware alone, however, because some computers still manage decent speeds. The afflicted computers are running Windows XP Service Packs 2 and 3, but so are the computers that are fully functional. These speeds severely impede productivity and are excruciatingly frustrating when trying to get work done in the early hours. Could it be an issue with the computer? Location? Router? Many thanks in advance, Justian
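    (One hedged way to narrow this down is to measure each link directly rather than through a download, e.g. with iperf; the host address is a placeholder. Comparing a fast and a slow machine against the same local server isolates the powerline segment from the internet connection:)

        # on a fast machine, start a throughput server:
        iperf -s
        # on a slow machine, measure raw LAN throughput to it:
        iperf -c 192.168.1.10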

    Read the article

  • CLI-Based monitoring tool for KVM

    - by Pinnacle
    I am developing a scheduler for running VMs on KVM. The scheduling allows over-commitment of resources like memory and CPU. For this I need a CLI-based monitoring tool that keeps giving me information about the resource usage of each VM, because due to over-provisioning the VMs on a particular host may run very slowly, depending on the benchmarks/programs each VM is running, in which case I need to migrate a VM to another host, and so on. I looked into libvirt-based tools such as collectd, Munin, Nagios-virt, etc. ( http://libvirt.org/apps.html#monitoring ). I also looked into the Ubuntu utility perf-kvm ( http://manpages.ubuntu.com/manpages/maverick/man1/perf-kvm.1.html ). I want to ask which CLI-based tool the community would recommend, so that I can build an automated scheduler that takes care of the above situation.
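    (As a baseline, libvirt's own virsh can be polled from a script before reaching for a heavier framework; a hedged sketch, where the domain name is a placeholder:)

        # list running VMs, then poll resource usage for one of them
        virsh list
        virsh dominfo myvm       # CPU time and memory allocation
        virsh domstats myvm      # CPU/memory/block/net counters (newer libvirt)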

    Read the article

  • Best scripting language for project [on hold]

    - by Dave
    This is a subjective question, but I don't know where else to ask it. I'd appreciate it if someone could direct me to an appropriate scripting language for my project; I'm a little new at this. The project is a website that will display a list of photo subject groups (such as "nature", "people", "sports", etc.) on the home page. The photos will all live in subdirectories of the main photo directory (photos), and each subject group will correspond to one subdirectory of photos. For example, in the photos directory there might be 3 subdirectories, "nature", "people" and "sports", and each of those subdirectories will hold the actual photos. The idea is that when the website owner wants to add, delete or update a subject group, all he has to do is add, delete or update a subdirectory of the photos directory. This means, I think, that I need a scripting language that can read the directories and files on the server and then send a web page built from that information. What is the simplest and easiest scripting language for this? Any ideas? Thanks
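    (As a hedged illustration of how little code is involved, a minimal Python sketch that builds the group list from the directory tree; PHP or any other server-side language would look much the same. The path is a placeholder:)

        # list_groups.py - emit one link per subject group (sketch)
        import os

        PHOTO_ROOT = "photos"  # assumption: photo directory under the web root

        # each subdirectory of photos/ is one subject group
        groups = sorted(d for d in os.listdir(PHOTO_ROOT)
                        if os.path.isdir(os.path.join(PHOTO_ROOT, d)))

        print("Content-Type: text/html\n")
        print("<ul>")
        for g in groups:
            print('  <li><a href="/%s/%s/">%s</a></li>' % (PHOTO_ROOT, g, g))
        print("</ul>")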

    Read the article

  • Master Data Management – A Foundation for Big Data Analysis

    - by Manouj Tahiliani
    While Master Data Management has crossed the proverbial chasm and is on its way to becoming mainstream, businesses are being hammered by a new megatrend called Big Data. Big Data is characterized by massive volumes, high frequency, a variety of less structured data sources such as email, sensors, smart meters, social networks and weblogs, and the need to analyze vast amounts of data to improve management decisions. Businesses that have embraced MDM to get a single, enriched and unified view of master data (by resolving semantic discrepancies and augmenting the explicit master data from within the enterprise with implicit data from outside it, such as social profiles) will have a leg up in embracing Big Data solutions. This is especially true for large and medium-sized businesses in industries like retail, communications and financial services, which would find it very challenging to get comprehensive analytical coverage and long-term success without resolving the limitations of a heterogeneous topology that leads to disparate, fragmented and incomplete master data. For analytical success from Big Data, in other words ROI from Big Data investments, businesses need to acquire, organize and analyze the deluge of data to make better decisions. Structured and unstructured data will need to coexist, with a tight link between the two to extract maximum insight. MDM is the catalyst that maintains that tight linkage, by providing an understanding of the identity and characteristics of the persons, companies, products, suppliers, etc. associated with the Big Data, and thereby helps accelerate ROI. In my next post I will discuss patterns for coexisting Big Data solutions and MDM. Feel free to share comments and thoughts on the above, as well as on integration or architectural patterns.

    Read the article

  • Why is the network speed on my Mac Pro (early 2009) so slow?

    - by Rafael
    I have a really weird networking issue on my Mac Pro (Early 2009): I can't get network speeds higher than about 2 Mbit/s, whether over AirPort or either of the Ethernet ports. An iMac and a Mac mini on the same network, with almost the same configuration, get about 25-30 Mbit/s. I've read a couple of threads about this on the official Apple forums, but there is no helpful information there. Has anyone else had Mac Pro network speed issues and found out how to solve them?

    Read the article

  • AllSparkCube Packs 4,096 LEDs into a Giant Computer Controlled Display

    - by Jason Fitzpatrick
    LED matrix cubes are nothing new, but this 16x16x16 monster towers over the tiny 4x4x4 desktop variety. Check out the video to see it in action. Sound warning: the music starts off very loud and bass-heavy; we'd recommend turning down the speakers if you're watching from your cube. So what compels someone to build a giant LED cube driven by over a dozen Arduino shields? If you're the employees at Adaptive Computing, you do it to dazzle crowds and show off your organizational skills: Every time I talk about the All Spark Cube people ask "so what does it do?" The features of the All Spark are the reason it was built and sponsored by Adaptive Computing. The Cube was built to catch people's attention and to demonstrate how Adaptive can take a chaotic mess and inject order, structure and efficiency. We wrote several examples of how the All Spark Cube can demonstrate the effectiveness of a complex data center. If you're interested in building a monster of your own, hit up the link below for more information, schematics, and videos.

    Read the article

  • Ensure Payroll Success with PeopleSoft Year-End Training for U.S. and Canada

    - by Breanne Cooley
    Year-end payroll processing and reporting is a requirement for your business. If you're responsible for completing these processes in either Canada or the United States using the PeopleSoft Payroll application, and if you're new to PeopleSoft Payroll or to performing these processes, consider enrolling in Oracle University's expert training. Our PeopleSoft Payroll specialists will guide you through the necessary steps to ensure you can smoothly and successfully perform your job. Training is specific to the country for which you are performing the processing and reporting. Training lasts one day and is delivered in our Live Virtual Class Format, which helps you avoid travel during this busy season. Here's the training we recommend: PeopleSoft Year-End Payroll - U.S. This course teaches you how to complete U.S. year-end processing and reporting using PeopleSoft Payroll for North America, step-by-step. Update tax reporting setup tables and update employees' income and tax records. Load each employee's year-end data into a single year-end record for processing and reporting.  Identify reports needed to reconcile the year-end data. Correct tax balances and other data as necessary. Generate final print and online W-2 forms and prepare the electronic file for the Social Security Administration.  Enter corrected W-2 information and print a W-2c form. Report periodic retirement distributions and related tax withholding amounts on form 1099-R.   Please Note: this course is intended for organizations using PeopleSoft release 8.81 or higher. PeopleSoft Year-End Payroll – Canada This course covers the steps necessary to perform Canadian year-end processing using Oracle's PeopleSoft Payroll for North America. Explore adjustments, balances, year-end slip processing, common pitfalls and errors and balancing reports.  Produce accurate year-end reporting results such as T4, T4A, RL-1 and RL-2.  Please Note: this course is intended for organizations using PeopleSoft release 8.81 or higher. See you in class! -Oracle University Marketing Team 

    Read the article

  • Parity Initialization after putting in two new disks

    - by lbanz
    All my firmware is up to date on the server and the controllers. Storage crashed over the weekend. I rebooted the server and it detected the two new disks I put in last week (I did check that both disks completed the rebuilding process at the time). After booting into the OS it gave me the informational message below. After 18 hours it is at 54%, so it is looking healthy. But I need to replace 5 more disks in the MSA. Should I wait for this process to finish before replacing more disks? "785 Background parity initialization is currently queued or in progress on Logical Drive 1 (15.0 TB, RAID 5). If background parity initialization is queued, it will start when I/O is performed on the drive. When background parity initialization completes, the performance of the logical drive will improve."

    Read the article

  • How can I guess if a USB cable will power my devices?

    - by rsanchez
    I've had problems with one long (4 meter) USB Mini-B to USB Type-A cable not being able to spin up a 2.5'' external hard disk, apparently because it could not supply enough current. On top of that, the cable used a Type-A to Mini-B adapter for the Mini-B end, which probably made things worse. Three different shorter cables I had around made the hard disk work without extra power, so it was definitely the cable's fault. And if I plugged the hard disk into mains power and used the long cable just for data, it worked. Here is some related information on powering devices through USB cables: http://www.girr.org/mac_stuff/usb_stuff.html . I don't have any long cables without an intermediary Type-A to Mini-B adapter to try. My question is: is there a way to guess whether a cable will provide enough power to charge a device or run a disk drive? Does it depend on the length of the cable, the build quality of the cable, or the fact that it uses intermediary adapters?
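    (A rough way to estimate it is plain voltage drop: resistance grows with cable length and with thinner wire, and the current crosses both the +5 V and the ground conductor. A worked example with assumed numbers:)

        28 AWG power wires:           ~0.21 ohm per metre
        4 m cable, out and back:      8 m of conductor, ~1.7 ohm total
        at 500 mA (USB 2.0 maximum):  drop = 0.5 A x 1.7 ohm = ~0.85 V
        left at the drive:            ~4.15 V, below the 4.75 V the USB spec
                                      guarantees, and often too little to
                                      spin up a 2.5'' disk

    So length and wire gauge dominate, each adapter adds contact resistance on top, and that is why short or thick-conductor cables work where long thin ones fail.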

    Read the article

  • How do you communicate improvements in tools and process to the development team?

    - by birryree
    Hi everyone, my team does a lot of internal tooling and infrastructure work; you can think of us as a small-scale version of the teams at Facebook, Etsy, Netflix, etc. who build the infrastructure for scaling their services up to thousands or tens of thousands of servers and supporting millions of users. Lately we've been running full steam ahead improving many of the tools we use internally, like tools for automatically creating new servers, setting up new application instances, and so on. An end result has been decreased developer frustration, but increased 'ignorance' among most of the developer team about how to use our tools correctly and effectively. More often than not, my team will be asked by other teams to help them use the tools. Solutions we've thought up or that are already in place:
    - All our code is relatively simple and self-explanatory, with good comments where necessary, so developers could read the scripts. Counterargument: you can guess this isn't a particularly good idea, having people read our tools' code to figure out how to use them.
    - All our code is committed to Subversion with very detailed commit messages about changes, so developers could read the commit emails. Counterargument: expect the developers to read all our commits? Ludicrous.
    - Wiki: we have an internal company wiki that we try to maintain with up-to-date information. Counterargument: as mentioned, my team moves fast, with more improvements added daily, so the wiki has to keep pace; this still relies on people reading something that changes constantly.
    - Email the team? We could email the team when we have a glut of improvements to communicate.
    So as you can see, we are trying to find new ideas and explore options we haven't thought of yet. Has anyone else been in a similar situation and have some guidance?

    Read the article

  • OS X Terminal using default ANSI colours for every theme

    - by FrogBot
    For some reason, every theme palette I've installed for the OS X Terminal.app has reverted to using the default ANSI colours. The themes had been working fine up until this point, and I can't determine what could have caused them to revert. For reference, my standard Solarized Dark colour scheme should look like the first screenshot in the original post, but it currently looks like the second. A quick look in the preferences panel shows that all the colours are correct except for the ANSI colours, which have reverted to their defaults. I don't know what other information would be helpful, but if you need any other info to help me troubleshoot, just ask and I'll update as quickly as I can.

    Read the article

  • Systemd can't start script?

    - by TokyoMEWS
    I have a bash script I want to run on startup. My system runs systemd, so I created a .service file with what I think is the necessary information:

        [Unit]
        Description=My Script
        After=network.target

        [Service]
        ExecStart=/home/myscript.sh

        [Install]
        WantedBy=multi-user.target

    I used systemctl enable to 'register' it and rebooted. On boot I was told my script would be executed, but I could neither see any of the messages echo should display on screen, nor did it write anything to a file, as the script was written to do. Additionally, it does not start the application it's supposed to start. systemctl status tells me that the script has run and exited successfully. Still, the script has no effect. If I run the script from a shell it works perfectly fine. Does anyone know what my problem could be?
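    (For what it's worth, the usual culprits with a unit like this are a script missing its shebang or execute bit, and output going to the journal rather than the console. A hedged sketch of the unit, not a verified fix:)

        [Unit]
        Description=My Script
        After=network.target

        [Service]
        # run once to completion at boot rather than as a daemon
        Type=oneshot
        ExecStart=/home/myscript.sh

        [Install]
        WantedBy=multi-user.target

    The script itself needs #!/bin/bash as its first line and chmod +x /home/myscript.sh; any echo output lands in the journal (journalctl -u myscript.service), not on screen.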

    Read the article

  • Benefits of Server-side Coding

    There are numerous advantages to server-side scripting languages over client-side languages when it comes to creating web sites that are more compelling than a standard static site. Server-side scripts are executed on the web server while it assembles the data to return to the client. These scripts let developers modify the content being sent to the user before the response is returned, as well as store information about the user. In addition, server-side scripts run in a controllable environment: the web server. The same cannot be said for client-side languages, because the developer has no control over the user's environment; some users may turn off client scripts, some may allow only limited access on their systems, and others may have full control of the environment. I have been developing web applications for over 9 years, and I have used server-side languages for most of the applications I have built. Here is a list of common things I have developed with server-side scripts:
    - Send email
    - FTP files
    - Security / access control
    - Encryption
    - URL rewriting
    - Data access
    - Data creation
    - I/O access
    The one feature where server-side languages will help me most on my website is data access, because my component will be backed by a SQL Server database. I believe form validation is one instance where server-side scripts and JavaScript can be used interchangeably, because it does not matter how or where the data is validated as long as the data that gets inserted is valid. However, my personal experience would sway my decision on which type of language to use for form validation, because both have advantages and disadvantages depending on the situation.
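    (As a hedged illustration of the server-side half of that validation, a framework-free Python sketch; the field names and rules are made up:)

        # server-side validation sketch: never trust what client-side JS did
        import re

        def validate_signup(form):
            """Return a dict of field -> error message; empty means valid."""
            errors = {}
            if not form.get("name", "").strip():
                errors["name"] = "Name is required."
            if not re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", form.get("email", "")):
                errors["email"] = "A valid email address is required."
            return errors

        # the same rules run regardless of what happened in the browser
        print(validate_signup({"name": "Ada", "email": "ada@example.com"}))  # {}
        print(validate_signup({"name": "", "email": "not-an-email"}))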

    Read the article

  • What's the best way to manage error logging for exceptions?

    - by Peter Boughton
    Introduction: If an error occurs on a website or system, it is of course useful to log it and show the user a polite message with a reference code for the error. And if you have lots of systems, you don't want this information dotted around; it is good to have a single centralised place for it. At the simplest level, all that's needed is an incrementing id and a serialized dump of the error details (with the "centralised place" possibly being an email inbox). At the other end of the spectrum is perhaps a fully normalised database that also lets you press a button and see a graph of errors per day, or identify the most common type of error on system X, or whether server A has more database connection errors than server B, and so on. What I'm referring to here is logging code-level errors/exceptions by a remote system, not "human-based" issue tracking such as done with Jira, Trac, etc. Questions: I'm looking for thoughts from developers who have used this type of system, specifically with regards to:
    - What are essential features you couldn't do without?
    - What are good-to-have features that really save you time?
    - What features might seem a good idea but aren't actually that useful?
    For example, I'd say a "show duplicates" function that identifies multiple occurrences of an error (without worrying about 'unimportant' details that might differ) is pretty essential. A button to "create an issue in [Jira/etc] for this error" sounds like a good time-saver. Just to re-iterate, what I'm after is practical experience from people who have used such systems, preferably backed up with why a feature is awesome or terrible. (If you're going to theorise anyway, at the very least mark your answer as such.)
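    (At the "simplest level" described above, the reporting side of such a system can be tiny. A hedged Python sketch that serializes exception details and ships them to a central collector; the endpoint and field names are made up:)

        # minimal client-side error reporting sketch
        import json, traceback, urllib.request

        COLLECTOR_URL = "http://logs.example.com/errors"  # hypothetical endpoint

        def report(exc):
            # serialized dump of the error details; the collector assigns
            # the incrementing reference id shown to the user
            payload = json.dumps({
                "type": type(exc).__name__,
                "message": str(exc),
                "stacktrace": traceback.format_exc(),
            }).encode()
            req = urllib.request.Request(
                COLLECTOR_URL, data=payload,
                headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req)

        try:
            1 / 0
        except Exception as e:
            report(e)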

    Read the article

  • Enterprise Manager Extensibility Exchange – Version 1.1 Now Available!

    - by Joe Diemer
    Since its announcement at Oracle OpenWorld 2012, the Enterprise Manager Extensibility Exchange is becoming the source for accessing Enterprise Manager entities, including plug-ins, connectors, deployment procedures, assemblies, templates, and more. Based on feedback, the Exchange has recently been updated so Enterprise Manager administrators can find and access Oracle- and partner-built plug-ins and connectors more easily. The Exchange enables anyone to contribute an Enterprise Manager entity through the "Contribute" tab, where information about the entity is captured and placed on the Exchange once it is approved. The Exchange encourages comment through the Enterprise Manager Forum. An Oracle partner can build a plug-in using the Extensibility Development Kit (EDK), found under the Development Resources tab. Oracle partners and customers can also engage a partner that has built its practice around plug-in development and deployment. One such partner is Blue Medora, which has used the EDK effectively to build plug-ins that manage non-Oracle targets. Next week Blue Medora will be a "Guest Blogger" here and tell a great story about heterogeneous datacenter management. Partners can also have their plug-ins validated through the Oracle Validated Integration (OVI) program. NetApp is an example of a partner that recently built an Enterprise Manager plug-in and validated it through the program. Check back here in two weeks for their blog post describing the value of an Enterprise Manager "OVI" plug-in and discussing specifics of the NetApp storage plug-in; check out the NetApp Enterprise Manager Validated Integration datasheet in the meantime. The Enterprise Manager Extensibility Exchange is located at http://www.oracle.com/goto/EMExtensibility.

    Read the article

  • Ubuntu Server 10.10 vs. Fedora Server 14 for Mono.NET app hosting in VM

    - by Abbas
    I want to create a web server running Mono, MySQL 5.5 and OpenLDAP as a VM (on VMware Workstation). Searching "Ubuntu Server vs. Fedora Server" mostly yields flame wars and noise; there are a few good articles available, but they are either out of date or don't offer very convincing arguments. I know the answer is most likely to be "it depends", but I wanted to harness the collective wisdom on Server Fault and get opinions, experiences and factual information to the extent possible. My selection criteria (other than what is mentioned above) would be:
    - Ease of use
    - Ease of development
    - Reliability
    - Security

    Read the article

  • SOA Composite Sensors : Good Practice

    - by angelo.santagata
    I was discussing an interesting design problem with a colleague of mine, Niall (his blog), on the topic of how to cancel an in-flight SOA Composite process. Obviously one way to do this is to cancel the process from Enterprise Manager ( http://host:port/em ), however we were thinking this isn't a very "user friendly" way of doing it. If you look at Niall's blog you'll see he's highlighted a number of different APIs which give you the ability to manipulate an SCA instance, e.g.:
    - a code snippet to purge (delete) an instance
    - how to determine the instanceId from a composite sensor value using the "composite_sensor_value" table
    - how to determine a BPEL process status using the cube_instance table
    Now all of these require that you know the instanceId of your SOA Composite. How does one find this out? Well, the easiest way is to create a composite sensor on the SCA component. A composite sensor is simply a way of publishing a piece of business data as part of your composite. The magic here is that you can later query composites based on this value. So a good practice is that for any composite you create, consider publishing a composite sensor value based on a primary key of some sort, e.g. orderId; that way, if you need to manipulate or query composites, you can easily look up the instanceId using the sensor value, as sketched below. For information on how to create a composite sensor see this documentation link.
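    (A hedged sketch of that lookup against the SOA infrastructure schema; the column names are from memory and should be checked against your version's schema:)

        -- find the composite instance id for a given business key
        SELECT composite_instance_id
          FROM composite_sensor_value
         WHERE sensor_name  = 'orderId'   -- the sensor published above
           AND string_value = '12345';    -- the business key searched for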

    Read the article

  • Vim - dynamic list of open buffers in a window

    - by asfallows
    I've investigated a few ways to maintain a list of open buffers in Vim, and some of them are very good, but none of them seem to behave the way I'd like. It's very possible that nothing like what I want exists, and it can't hurt to ask. I've been using NERDTree in GVim, and I like the idea of putting the information in a slender left-hand window. I've put together a handy diagram for how I'd like my environment to look:

        |--------|---------------------------------------|
        |        |                                       |
        |        |                                       |
        |NERDTree|                Windows                |
        |        |                                       |
        |        |                                       |
        |        |                                       |
        |--------|                                       |
        |        |                                       |
        |  List  |                                       |
        |   of   |                                       |
        |  Open  |                                       |
        | Buffers|                                       |
        |        |                                       |
        |        |                                       |
        |--------|---------------------------------------|

    So my question is: is there a Vim-native or plugin-enabled way to maintain a list of currently open buffers and select/edit/close from that list, inside a window similar to NERDTree? I understand that this approach may be incongruous with the Vim way of doing things, and if you feel like I'm missing something about how to manage multiple files in a Vim session, please leave a comment with suggestions!
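    (For reference, the native building blocks such a window would wrap; these work in any Vim without plugins:)

        :ls                " list open buffers with numbers and flags
        :b 3               " switch to buffer 3 (also :b partial-name)
        :bd 3              " delete (close) buffer 3
        :vert sbuffer 3    " open buffer 3 in a vertical split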

    Read the article

  • Need to run a .sh as root on boot or login

    - by Graymayre
    Still new to Linux, running Ubuntu 12.10. I have a wireless stick (AE2500) with known issues that have been partially solved using ndiswrapper. However, to use it I must run the same commands every time I reboot, effectively uninstalling and reinstalling the driver. I made a .sh file to make this easy, but I must do the sudo login every time. There are three things I would like to know, and although not all of them are necessary to solve this particular problem, I would still like to know them all for learning purposes:
    1) run scripts or a file.sh on boot (as well as other programs)
    2) run scripts or a file.sh automatically with root privileges
    3) make the install permanent, so as not to have to go through the process every time
    Any additional information that can help me with this that I did not think to ask about (including streamlining my commands), or general knowledge, would be greatly appreciated. Following are the contents of the file; I pretty much made it as I would have entered it:

        cd ~/ndiswrapper-1.58rc1
        sudo modprobe -rf ndiswrapper
        sudo rm /etc/modprobe.d/ndiswrapper.conf
        sudo rm -r /etc/ndiswrapper/*
        sudo depmod -a
        sudo make uninstall
        sudo make
        sudo make install
        sudo ndiswrapper -i bcmwlhigh5.inf
        ndiswrapper -l
        sudo modprobe ndiswrapper
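    (A hedged sketch of the boot-time option on 12.10: /etc/rc.local is still honoured and runs as root near the end of boot, so the sudo prefixes inside the script become unnecessary there. The script path is a placeholder, and both files need chmod +x:)

        #!/bin/sh -e
        # /etc/rc.local - executed as root at the end of multi-user boot
        /home/graymayre/reinstall-ae2500.sh   # placeholder path to the script
        exit 0

    For running it on demand without a password prompt, a sudoers entry added via visudo, such as graymayre ALL=(root) NOPASSWD: /home/graymayre/reinstall-ae2500.sh, is the usual approach (the username and path are placeholders).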

    Read the article

  • .NET Framework 4.0 Targeting Pack does not show in Visual Studio

    - by balexandre
    How can I install the .NET Framework 4.0 targeting pack on Windows 8 Pro with Visual Studio 2012 Professional? The framework does not appear in the target framework dropdown (first screenshot in the original post), and if I follow the Install other frameworks... link I get to a Microsoft page with the targeting pack downloads. I have installed the .NET Framework 4.0.1 Targeting Pack and the .NET Framework 4.0.2 Targeting Pack (I can't install 4.0.3) and restarted the machine over and over, but Visual Studio still does not show the framework in the dropdown menu. What am I doing wrong? The original post also shows what regedit says I have installed on my machine.

    Read the article

  • Book Review: Professional ASP.Net MVC4

    - by Sam Abraham
    The past few weeks have been particularly busy as I continue to dedicate a bigger portion of my free time to refreshing my memory and enhancing my knowledge of best practices for technologies we plan on using in a major upcoming project. In this blog post I will provide a brief overview of my latest reading, "Professional ASP.NET MVC4" by Jon Galloway, Phil Haack, Brad Wilson and K. Scott Allen. This book is a must-read for web developers looking to enhance their MVC expertise with best practices and tips shared by recognized industry experts. It takes the reader on a 16-chapter journey towards being a better ASP.NET MVC developer, with chapter 16 putting all the information covered into practical context by dissecting the implementation of NuGet.org, a real-life open-source ASP.NET MVC project. All code samples referenced in the book are conveniently accessible via NuGet, a free, open-source library package manager that installs as a Visual Studio extension. Chapters 2, 3 and 4 thoroughly cover MVC's components: Controllers "C", Views "V" and Models "M" respectively. Chapter 5 covers the extension methods (helpers) provided to speed and ease the use of common HTML elements such as forms, textboxes and grids, to name a few. Chapter 6 tackles built-in validation, with examples and use cases on implementing custom validation that plugs into the MVC framework. Chapters 7 through 13 discuss the latest on membership, Ajax, routing, NuGet and the ASP.NET Web API. Chapters 12 (Dependency Injection) and 13 (Unit Testing) demonstrate a big competitive advantage of MVC: its ease of testability and pluggability. Chapters 14 and 15 target the advanced developer, showcasing how to extend MVC to customize and replace every piece in the framework. In conclusion, I strongly recommend Professional ASP.NET MVC 4 as an excellent read both for developers already using MVC and for those getting started with the framework. Many thanks to the Wiley/Wrox User Group Program for their support of our West Palm Beach Developers' Group. You can access my reviews of other books I have recently read:
    - Professional ASP.NET Design Patterns
    - Professional WCF 4.0
    - Inside Windows Communication Foundation
    - Inside Microsoft SQL Server 2008 series

    Read the article

  • Word document has very strange "hidden" formating after converting from PDF to .docx

    - by Celeritas
    I have a PDF document with my resume which I need to edit. I used this service to convert it to .doc, opened it in Word 2010 and saved it as .docx. There are some bizarre problems where there's empty space, and if you try to delete it, the text gets shifted into vertical columns. How can I fix this? I'm afraid the document contains a lot of private information, and if I fill it with dummy text the formatting gets even more messed up, so unfortunately I can't post screenshots.

    Read the article
