
news from the world of Linux

Minutes of the Board meeting of November 12th, 2013

28.11.2013 03:45  wiki: https://wiki.gnome.org/FoundationBoard/Minutes/20131112

= Minutes for Tuesday, November 12th, 2013, 16:00 UTC =

== Next Meeting ==

* Tuesday, November 26th, 2013, 16:00 UTC

== Attending ==

* Marina Zhurakhinskaya
* Emmanuele Bassi
* Tobias Mueller
* Ekaterina Gerasimova
* Joanmarie Diggs

=== Regrets ===

* Andreas Nilsson
* Sriram Ramkrishna
* Karen Sandler

=== Missing ===

* Rosanna Yuen

== Board Meeting Agenda ==

* Implement a Foundation employee policy for COLA
 * At the face to face board meeting we discussed having employee salaries increase when inflation increases, as other comparable nonprofits do.
 * http://data.bls.gov/timeseries/CWUR0000SA0?output_view=pct_12mths
 * Blocking on the budget
  * We may have enough coverage for the next 5 years, but we are missing actual figures
  * GUADEC and hackfest may be over budget
  * Income from adboard fees should be able to cover the needed amount
 * Decision should be deferred until we have a proper budget
* Reimbursements
 * GIMP
  * Last GIMP reimbursements were 6 months late, we must do much better than that
  * The treasurer should be kept in the loop with a reminder every month
  * Michael Schumacher sent a series of ideas for improving the process
  * Joanie has been working on a series of ideas of her own on the wiki
  * '''ACTION''': Joanie to go over Michael Schumacher's ideas on the GIMP funding processes and merge them with her ideas on the wiki
 * GUADEC
  * Sindhu has not received hers, can Rosanna verify that it has not been returned?
  * '''ACTION''': Rosanna to check if the reimbursement for Sindhu has been returned
  * The account number was missing from the payment instructions. How can problems like this be avoided?
  * Updating the GnuCash file should make these things apparent in the future
* Scheduling an IRC Foundation meeting
 * '''ACTION''': Emmanuele to send a proposal to foundation-list about possible IRC meetings
* GNOME Asia Summit update
 * Co-hosting with the SUSE conference is not on the table any more
 * But we may be able to co-host with the Fedora conference in Asia
* Behnam Esfahbod volunteering with the Foundation
 * Should we get some form of report?
  * Either email to the board, or blog on Planet GNOME
  * Every couple of weeks would be adequate
 * '''ACTION''': Marina to ask Behnam to report on his state and work
* Budget
 * Karen/Marina report on OPW income and expenses for next year
  * Could the Foundation sponsor 2/3 interns?
  * We should get around 30 interns in total for the last 2013 round
 * Kat needs up-to-date accounts/GnuCash file
  * budget is 6 weeks late
* World of GNOME Forums hosting
 * '''ACTION''': Joanie to ask Karen to report on board-list about the World of GNOME Forums
* Handling Pitivi funds
 * '''ACTION''': Joanie to ask Karen to report on board-list about handling the Pitivi funds
* Upcoming Advisory Board Agenda
 * '''ACTION''': Joanie to ask Karen to report on board-list about the adboard meeting status
* Post-release teams meeting
 * From the minutes of the September 3rd board meeting
 * At the Board Q&A one of the proposals was to have a regularly scheduled meeting between the various teams involved in the release process
 * An effort similar to the reports for the AGM and the annual reports, but for the benefit of collaboration between teams
 * Emmanuele proposed to have a "post-mortem" after the development cycle ends
  * Conference call or IRC meeting between teams discussing what went right and what went wrong during the cycle
  * Maybe blue sky planning for the next one
 * Emmanuele is contacting various members of the release, design, engagement, a11y, i18n, and docs teams
 * Seems that the idea is getting buy-in
 * '''ACTION''': Emmanuele to draft a date and time for the post-release meeting

== Completed Actions ==

* Rosanna to set up an account for Kat to access the Foundation's PayPal
* Joanie will ping the Release Team again for the Foundation IRC meeting after the 3.10 release is completed
* Joanie and Kat to draft a new travel assistance form to replace the current one

== Pending action items ==

* Andreas to put together a GNOME supporter card for our donors
* Emmanuele to send Zana and Tobi an update on FoG donors gifts
* Emmanuele to send hackers up for "adoption" the list of donors that want a postcard
* Emmanuele to send a proposal to foundation-list about possible IRC meetings
* Emmanuele to draft a date and time for the post-release meeting
* Joanie to draft a proposal for the photography policy at GNOME conferences to discuss on foundation-list
* Joanie and Karen to meet and discuss the photography policy for GNOME events
* Joanie to go over Michael Schumacher's ideas on the GIMP funding processes and merge them with her ideas on the wiki
* Joanie to ask Karen to report on board-list about the World of GNOME Forums
* Joanie to ask Karen to report on board-list about the adboard meeting status
* Kat to create a private wiki page on the web services accounts holders and passwords
* Karen to write the Privacy policy for GNOME services
* Karen to contact Tor, EFF, OTI for feedback and eventual funding for privacy work
* Karen to "call for ideas" for privacy work related bids at GUADEC and by blog/email to foundation-announce
* Karen and Tobi to continue pursuing the fund collection in Europe
* Karen to circulate a new draft of the OPW mentor/student contracts
* Marina to ask Behnam to report on his state and work
* Rosanna to check if the reimbursement for Sindhu has been returned
* Sri to investigate better uses of adwords on the GNOME websites
* Tobi to talk to Andrea to move the PayPal data extraction scripts over to the GNOME infrastructure

GNOME 3.11.2

23.11.2013 16:15 The second release of the GNOME 3.12 development cycle is here. See https://wiki.gnome.org/ThreePointEleven/Features/ for the new features that have been proposed for this cycle.

To compile GNOME 3.11.2, you can use the jhbuild modulesets (http://library.gnome.org/devel/jhbuild/, http://download.gnome.org/teams/releng/3.11.2/).

Note that gnome-shell-wayland requires new clutter and cogl releases that did not happen in time for this release. You can use the 1.18 branches of these modules or wait for their next releases. gnome-shell under X works fine with the latest 1.16 clutter and cogl releases.

The release notes that describe the changes between 3.11.1 and 3.11.2 are available. Go read them to learn what's new in this release:

core - http://download.gnome.org/core/3.11/3.11.2/NEWS
apps - http://download.gnome.org/apps/3.11/3.11.2/NEWS

The GNOME 3.11.2 release itself is available here:

core sources - http://download.gnome.org/core/3.11/3.11.2
apps sources - http://download.gnome.org/apps/3.11/3.11.2

WARNING! WARNING! WARNING!
--------------------------

This release is a snapshot of early development code. Although it is buildable and usable, it is primarily intended for testing and hacking purposes. GNOME uses odd minor version numbers to indicate development status.

For more information about 3.11, the full schedule, the official module lists and the proposed module lists, please see our colorful 3.11 page: http://www.gnome.org/start/unstable

For a quick overview of the GNOME schedule, please see: http://live.gnome.org/Schedule

Enjoy,

Matthias Clasen
GNOME Release Team
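The jhbuild setup described above can be sketched as a jhbuildrc configuration. This is a minimal, illustrative fragment only: the moduleset and meta-module names and the paths below are assumptions, not taken from the announcement, so check them against the modulesets actually shipped at http://download.gnome.org/teams/releng/3.11.2/ before building.

```python
# ~/.config/jhbuildrc -- illustrative sketch, not an official config.
# Moduleset and module names below are assumptions; verify against the
# 3.11.2 releng modulesets before use.
moduleset = 'gnome-apps-3.11.2'        # assumed name of the release moduleset
modules = ['meta-gnome-core']          # assumed meta-module for the core desktop
checkoutroot = '~/jhbuild/checkout'    # where sources are unpacked
prefix = '~/jhbuild/install'           # where the build is installed
# gnome-shell-wayland needs clutter/cogl releases that missed 3.11.2,
# so skip it (or build the 1.18 branches of those modules instead).
skip = ['gnome-shell-wayland']
```

With a file like this in place, running `jhbuild build` would walk the moduleset and build the listed modules into the configured prefix.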


GNOME Mobile hits the road

21.11.2013 15:30 

Unix, Despite Its Age, Still Has Followers

20.11.2013 07:00  Operating systems, graphics cards and processors all have combined to improve the workstation and give shape to the market. Although these factors are conspiring to generate a considerable shift towards the Windows NT or personal workstation, the traditional Unix workstation still has its loyal followers. According to figures released last year by research firm IDC Canada Ltd., NT workstation shipments, between 1998 and 2003, are projected to have a compound annual growth rate of 15 per cent. For the same period, traditional workstations are expected to decline by three per cent each year. Despite the Unix workstation’s decline, it will still find a home in many niche areas and among companies that have made large purchases in the past and are reluctant to switch, said IDC. “It’s still doing very well in the scientific area, as well as in large-scale manufacturing,” says Alan Freedman, research manager for servers and workstations at IDC. “It’s the organizations with the huge installed base that haven’t made the transition, while the organizations that are small or more agile are moving towards NT.” Traditional Unix workstations are still found in departments devoted to engineering, mapping, geology and other technical applications. “The Unix market is not shrivelling up or fading away,” Freedman says. “But what we’re seeing now is that some of the mechanical and electrical design areas that were wholeheartedly Unix are now at least taking a look at the NT workstations.” The reason they are looking at personal workstations has a lot to do with lower prices, increasing operating system reliability and the advances in processor and graphics technology. Independent software vendors have responded by porting many of their applications to Windows NT. Backing up the capabilities of the personal workstation are improvements on the processor front. Of particular importance are Streaming SIMD Extensions, an innovation from Intel Corp. 
of Santa Clara, Calif. Similar to the MMX innovation, SSE gives the Pentium III the ability to better perform the floating point calculations needed for high-end graphics calculations. Coinciding with the advances in processing are low-cost graphics cards which ease entry into the world of high-end graphics work. In the past, the customer had to spend $3,000 or more to get a graphics card with an on-board geometry accelerator but now there are cards that can do this for less than half that amount. Able to leverage the power of on-board processors, users get graphic performance that scales with their processing performance. One of the prominent applications of workstations’ high-end graphics capabilities is in Geographical Information System applications. Some of the major GIS vendors, such as ESRI Inc. of Redlands, Calif., are porting many of their products to the Windows NT operating system. Higher processing power, in conjunction with the latest graphics cards, allows for a more dynamic presentation of geographic information. The newer graphics cards also allow workstation users to use two monitors on a single workstation. Graphics cards that make this possible include the Millennium G400 Series from Matrox Graphics Inc. of Montreal. Based on what Matrox calls the “DualHead Display,” the feature allows the user to extend one application across two monitors, or open multiple applications at once. Some users, the company says, display applications on one monitor while showing tool bars on the other. Six months ago desktop PCs were equipped with 4MB video cards. Now users are getting 8MB or 16MB. But while this is pretty powerful for the desktop level, hardware isn’t necessarily the only means of defining a workstation, argues Kevin Knox, a senior research analyst with the Gartner Group in Stamford, Conn. “I think workstations are defined more by the application than they are by the hardware,” he says. 
“Generally, workstations are systems optimized for a specific vertical application. It’s not just high-end, mid-range and low-end. “I agree that the lines are blurred and there are vendors out there that play to that. I think their workstation numbers are inflated significantly because they are showing workstations in the desktop market.” “The high-end market is flat to a small decline in terms of revenues, and a larger decline in terms of units because of NT,” says IDC’s Freedman. “However, some companies are coming down with lower-priced Unix workstations to combat that — most notably Sun Microsystems with workstations that are lower in price and target the same markets as the NT workstations.” “So while Unix does not have the majority of the units, it does have the lion’s share of the revenue,” says Freedman. “We are predicting over the next four or five years, slight negative growth in units and a bit higher negative growth in revenue — about two or three per cent.” Gartner Group reports that NT will eventually supersede Unix in the high-end market. The Unix versus NT operating system game has been playing for some time now, and vendors, which at one time clearly chose sides, no longer seem as sure of the winning team. Not too long ago, the workstation market consisted of Sun, HP, IBM and SGI, but there has been a rapid penetration of Wintel systems, says Knox. “Sun is trying to protect its installed base, and frankly not doing very well on the low end,” he says. “They introduced the Darwin product and that really hasn’t taken off as I know they wish it had.” What users are saying, he continues, is they have an office productivity machine for the everyday applications, and a Unix box, and they want to consolidate them into a single system. “Right now that’s the NT system,” adds Knox. He expects traditional PC vendors such as Compaq and Dell to take the lead in market share because of the improved performance of NT, Xeon processors and other technologies. 
There are, however, still some markets that can only be served, at this point, by the robustness Unix delivers. Traditionally, high-end workstation markets have included mechanical computer-aided design and electronic design in industries as diverse as auto, finance and software design.

Changes in workstation market

The rise of the personal workstation has dramatically changed the face of the workstation market in Canada — at least in terms of vendors. In 1997, according to IDC, Hewlett-Packard Co. was the leading vendor with more than 14,000 units shipped in that year. Second was Sun Microsystems Inc. with approximately 8,000 units shipped. Following Sun were IBM, Digital, Compaq, Dell and Silicon Graphics. Since that time, the Windows NT/personal workstation market has been growing at 15 per cent compound annual growth while the Unix market has been declining by a three per cent annual growth rate. Trends for both camps are expected to continue until 2003. In 1999, 19,500 workstations were shipped in Canada. As much as 32.6 per cent of the market is now held by Dell. Compaq follows at 23.7 per cent, then Hewlett-Packard at 21.6 per cent, followed by IBM at 14.7 per cent. Other workstations account for the remaining 7.4 per cent of the market, IDC Canada reports.

RISC machines no longer dominate

Three years ago, the workstation market was dominated by RISC processor-based products running Unix operating systems and applications. Today, several developments in this marketplace have allowed advanced application users to rely on other processors to provide comparable performance to a traditional workstation at a lower price. 
A workstation-class system is a higher-performance computer specifically engineered for applications with more demanding processing, video and data requirements, intended for professional users who need exceptional performance for computer-aided design, geographic information systems, digital content creation, computer animation, software development and financial analysis. With the introduction of Pentium II processors, many computer companies expanded their product lines to offer Intel-based workstations. The added performance provided by these and successive Intel Pentium III and Pentium III Xeon processors has resulted in a strong shift from proprietary, traditional workstations to branded personal workstations, which use the Windows NT operating system. Workstation users benefit from rapidly evolving processor technology. High performance workstation-class systems let power users be more productive as projects can be completed much faster, saving organizations time and money. The workstation market has been one of the first to benefit from the set of instructions incorporated into Intel’s Pentium III processors, called Streaming SIMD Extensions. This performance improvement will come from the new SSE-enhanced applications and drivers being introduced by hardware and software vendors. Most branded workstations also provide the option to add a second processor, allowing users to take advantage of the multi-tasking and multi-threading capabilities of their applications and operating systems. In addition to dual processor support, workstation-class products are differentiated by their options for advanced OpenGL graphics adapters, greater memory and storage expansion, higher performance hard drives and application certification. It is important to understand that all 3D graphics cards are not created equal. 3D video adapters can generally be categorized as those optimized for advanced workstation applications or those that are good for games. 
OpenGL support is the industry standard that separates a workstation-class card from a gaming card. Most of the workstation graphics glory goes to the high-end 3D video cards, but multiple monitors are also an important productivity tool for many workstation users. Two or more monitors can be of benefit to those who require more display space for increased efficiency and effectiveness while multi-tasking. For instance, multiple monitors can help software developers create and debug applications by having an application on one screen and a debugger on another, or a programming editor on one and an online reference manual on the other.

Fanatics? The War Was Won!

20.11.2013 07:00  I started out in 1994 as a Linux advocate, saying to myself, “This is great, but I wish it were easier to install and didn’t screw up my boot sector.” In 1995-1996, I forgot about it so that I could concentrate on applications. In 1997 I went into denial, and in 1998 I tried to remain objective. In 1999, I’m taking a “who cares” attitude. I’m not denying Linux per se; I’m simply refusing to get caught up in an OS holy war. It’s not 1999 yet, though, so I still have a little time left for some more denial–not just about Linux, but also about Windows 2000 and NetWare. Linux fanatics out there, you’re going to have to get over this: In some aspects, Windows NT is a better operating system. The biggest NT advantage is that it has a development model and a ton of rich consumer-friendly applications. Were that the only thing, Linux would be home free, since the mass acceptance of the operating system will spur on more applications. But Linux also has problems with its scheduler. I’ve written before that the Windows NT scheduler is not up to par with what is available on some Unix platforms. However, NT’s scheduler makes Linux’s look like dog meat. Another Linux problem is with I/O. An engineer I know says that Linux is rife with Ring 3 scaling problems. But he added that the operating system will “get there” soon enough. The trouble with NT starts with its registry, which most engineers complain is a horrible mess. The only people who like the NT Registry are those who sell packages to “fix” it. As bad as it is, the registry is the least of Microsoft’s worries. We have today a big need for 24-by-7 uptime, and NT just doesn’t cut it. NT doesn’t allow IT managers to gracefully kill rogue programs. It makes organizations reboot systems too often, even when minor, noncritical application-level changes are made. Sure, Microsoft and others have patches, kludges and fixes that let NT function in this environment. 
But corporations want guaranteed uptime; that’s why Linux is perfect here. NetWare 5.0 should have been poised to reap profits from a delay in Windows 2000 and the newness of Linux. Unfortunately, there are a ton of problems with NDS, including incompatibilities between NDS with NetWare 5.0 and NDS with NetWare 4.0 implementations. There are also unconfirmed reports that NetWare 5.0 is slower than NetWare 4.0 in some instances. The performance problem stems from NetWare’s single-threaded TCP/IP stack. But really, these performance differences are so slight that they shouldn’t make a big difference. All this hand wringing is meaningless in a way. We in the press and in the community constantly operate in an “exclusive OR” world. That is, if something new comes along, we have to assume it will displace something else. But the buying practices of corporations rarely function in this way. Corporations buy to solve problems. That’s why I see businesses forcing vendors to work together. The consumers will push Microsoft to accept Linux; they’ll push for development of stronger NT development APIs on the Linux kernel. They’ll push Microsoft to accept NDS because consumers don’t plan to dump it. Next year, though, Linux will be pushing other Unix vendors out of the market. The smartest move Novell could make would be to completely dump the NetWare code base and move all of the NetWare services to Linux. Caldera supports NetWare for Linux now. Watch out for Caldera, by the way. In 1999, it’s going to make some Linux announcements that will knock your socks off.

Down With NT! Up With Linux!

20.11.2013 07:00  This could be described as the year of the operating system. Microsoft Corp. is racing to finish Windows 2000 before 2000, IBM and Santa Cruz Operation are forging a new version of Unix for Intel 64-bit processors and Compaq, Sun and Hewlett-Packard are toughening their proprietary Unix versions. For an IS manager it’s an embarrassment of riches – and a time of confusion. The expected lock-down of systems starting this summer due to Year 2000 concerns may give users a few months of breathing room and time to compile a list of questions. Among them: Is it worth waiting for the much-delayed Win2000 or should we stick to NT Server 4.0? Will Win2000 be as reliable as Unix? Will the Data Centre edition be worthy of a data centre? And, as always, is Windows’ alleged cost of ownership advantage just window-dressing? “NT versus Unix? I get this question from clients two times a day,” says Tom Bittman, vice-president and research director at the Gartner Group in Stamford, Conn. With more mission-critical applications coming out, NT supporters want them added to the IT mix. But managers wonder if Microsoft is up to the task. “In almost all cases they’re leaning towards Unix,” said Bittman, “and wondering if they’re crazy.” But Bittman warned NT is “probably two years behind the hype.” “NT still has issues of scalability for the majority of the market, and that’s one reason many companies are going with Unix.” In fact, he added, Unix is undergoing a bit of a come-back. “We’re definitely seeing the start of a slow-down in the server space. A lot of companies are realizing Unix still has a place, and that NT is still missing some attributes. The belief is they may be fixed in Windows 2000. 
But while they’re waiting, that puts more of a focus on Unix for certain mission-critical and back-end applications.” Microsoft’s ambitions are high: the Data Centre edition will scale up to 16 processors and 64 GB of memory, said Erik Moll, Windows 2000 marketing manager for Microsoft Canada. The Advanced Server edition will go up to four-way symmetric multi-processing. Both will have advanced clustering and load-balancing capabilities. Other features will include the Active Directory, a management service with file-level encryption and support for Public Key Infrastructure, and Web and Internet support services. “There’s no question as we move forward to Windows 2000 one of our key focuses will be on reliability and availability,” said Moll. “We’ve greatly reduced the number of reboots as you do reconfigurations of your system, and we also worked hard to make sure device drivers have gone through compatibility testing.” Until Win2000 is proven, however, users are looking at NT. For Bittman, Unix has it over NT for scalability, mixed workloads, high availability and reliability, maturity of applications and the availability of people skilled enough to run mission-critical deployments. Expect more than 200 concurrent users on your system? Don’t use NT, he advises, unless perhaps you’ve got SAP R/3. Gartner has seen SAP implementations with 850 users, not big by Unix standards, but impressive considering most applications on NT can’t handle more than 200 users. It speaks to how well SAP has been tuned for Windows, said Bittman. However, he added, the majority of the market would need about 400 users, and that’s where Win2000 is headed. As scalability declines as an issue, high availability and reliability will become more important, said Bittman. Microsoft’s “Wolfpack” clustering technology is two years behind Unix, said Bittman, but will be better in Win2000. “Our biggest concern is NT reliability,” he said, “and it’s not getting better. 
In fact, I have clients who tell me in their view, every release of NT has only gotten worse. I don’t think that’s true, but there’s that perception out there.” The biggest issue is memory leaks. Microsoft acknowledges the problem, but most will be plugged only in Win2000. Meanwhile, Gartner tells clients to expect an NT system will go down at least once a week. For companies who can tolerate that kind of performance, he said, NT will be good enough. However, Ritchie Leslie, director of Montreal-based DMR Consulting Group Inc.’s Microsoft strategic alliance, believes NT has a big place in the enterprise. “For medium-scale applications, say transaction systems supporting a few hundred users, Windows environments have a clear total cost of ownership advantage over Unix environments,” he said. “NT is cheaper to buy, most organizations have NT servers so if they can avoid having to buy a specialized Unix box to run an application, they can reduce the total cost.” There have been “huge advances” in NT management tools, he said, adding the operating system is catching up in stability and reliability. “For all but the very largest applications, Windows NT is becoming an increasingly practical platform,” he said. But, he added, it’s not ready for a large data warehouse or large transaction-based system. Bev Crone, who watches both platforms as general manager of midrange system sales for IBM Canada Ltd., acknowledges that NT/Win2000 use will continue to eat into the Unix market. “At first blush,” he said, Unix looks more expensive than Microsoft’s offering, but not when the user considers factors such as reliability, availability and scalability. Unix vendors aren’t standing still, he added, pointing to the IBM Unix collaboration on Project Monterey for Intel chips. But Bittman is skeptical. Santa Cruz, Calif.-based SCO Inc. hasn’t set sales records with UnixWare, its Intel offering, he said. 
Vendors are only now realizing that the first version of the IA-64 chip will create a small market, which won’t grow until the next generation of processors, called McKinley, debuts. While some analysts believe Microsoft is aiming for an October release of Win2000 Professional, Server Edition and Advanced Server, Bittman says that will only be for “bragging rights.” He expects it will come out next year and will be filled with bugs that weren’t caught during beta testing, because users are telling him they aren’t testing it with mission-critical apps. “We’re telling clients don’t deploy it in a production mode for at least six to nine months after general availability, after at least one service pack and maybe two,” Bittman said. “We’re also telling them it will be less reliable than NT 4 with service pack 5 for a year.”

Corporate Acceptance Was Completely Necessary For Linux

20.11.2013 07:00  For IT organizations such as West Virginia Network in Morgantown, which runs the network applications for the state’s higher education institutions on AIX RISC 6000, NT, and Intel platforms, “it would probably take a killer app to move us to Linux quickly,” says Jeff Brooks, lead systems programmer for WVNET. On the desktop side, however, he says Linux is taking hold. “We have an increasing number of people running Linux for personal productivity, including mainframe programmers. IT professionals who are not supporting PC products are spending too much time maintaining their PCs. We calculated the time people spent doing non-business, non-mission-related maintenance–doing upgrades, or rebooting when that blue screen comes up on NT–and it’s non-trivial.” As an embracer of Linux, Java, and other things non-Microsoft, IBM, which dropped to Number 2 in the Software 500, grew its software revenue 6% to $11.9 billion, with total corporate revenue growing 4% to $81.7 billion. IBM, like other Top 10 companies Sun, HP, Compaq, and Hitachi, derives the bulk of its revenue from hardware sales. IBM’s strategy for making money from software and services is a model the others are also following, each in their own way, says IDC’s Bozman. “If you look at IBM you see a model for how you can make more from software and services, but more marketable, more cross-platform. Look at Lotus and Tivoli.” Like IBM, HP “realizes there is a synergy there in combining software sales and services.” Sun, she says, uses software more as a lever to drive hardware sales. And unlike IBM, “Sun has a small professional services organization and doesn’t want to be concerned about competing with the Andersens, etc.” Compaq entered the enterprise software fray with its acquisition of Digital Equipment, inheriting not only the software but Digital’s services organization. 
While Compaq so far has not been able to meld this acquisition as smoothly as it probably hoped, and recently replaced CEO Eckhard Pfeiffer with chairman and acting CEO Ben Rosen, analysts are positive about Compaq’s ability to move forward in the enterprise software world. “Compaq already has a lot of corporate IT accounts; they have extremely strong relations with the IT world. The ability to take DEC and integrate it is an obvious question mark, but you have to believe Compaq will make it an important part of the future,” says Tim Bajarin, president, Creative Strategies Inc., San Jose, Calif. He adds, “Clearly they see enterprise-driven software as a key component to the overall value-added products they sell in the marketplace. The demand for complete systems solutions is on the rise, not the decline. I would be surprised if Compaq doesn’t get it right.” Also pushing hard in the one-stop shopping realm is Computer Associates, now busy absorbing its recently announced acquisition of Platinum Technology. CA in 1998 grew its software revenue 12% to almost $4.9 billion, with total corporate revenue growing to almost $5.1 billion. The combination of Platinum and CA, based on 1998 software revenue, would be $5.6 billion, which would rank the combined company this year at #3, surpassing Oracle. Size seems to matter in the systems management market, as the industry consolidates into a smaller group of large suppliers. Collectively, companies in the Software 500 that compete in the systems/network management space grew software revenue an average of 16.2%. In the enterprise applications arena, where both Oracle and PeopleSoft compete, “the key priority for ERP vendors is to extend their applications and frameworks to the world of e-business,” says Steve Bonadio, senior analyst, Hurwitz Group, Framingham, Mass. 
“ERP will be the backbone that enables companies to efficiently and effectively communicate and collaborate with themselves, their customers, partners, and suppliers.” Both Oracle and PeopleSoft grew at a good pace, with Oracle reporting a 20% growth in software revenue to $5.3 billion. And PeopleSoft hit the billion dollar mark in 1998, with its 48% increase in software revenue. Both companies beat the average software revenue growth rate for companies in the Software 500 that compete in the ERP/enterprise applications market. While the ERP suppliers are benefiting from a still-healthy demand for their solutions, corporate IT professionals still need to evaluate the health of their potential vendors, and the health of their strategy, says Hurwitz’s Bonadio. “As ERP vendors aggressively round out their product functionality and architectural strategies, companies using ERP applications need to make sure that their existing ERP investments are secure. There is nothing worse than spending millions and taking years to implement an ERP solution only to find out that you have to do it all over again.” ERP was not the only software segment thriving in 1998. Among the Software 500, suppliers in both the Internet/e-commerce and data warehouse/OLAP markets saw software revenues rise an average of 17.8%. Across all segments of the software industry, many companies grew through merger or acquisition, a trend that has continued as the industry matures. Among the Software 500, 32% merged with or acquired another company during 1998. While investment banks tend to say this is largely a positive trend for IT buyers, assuring the continuance of new and innovative products from startups and small companies, and enabling more one-stop shopping and more formalized support organizations, IT professionals are not so sure. The supposed benefits of one-stop shopping “depend almost completely on the willingness of the vendor to try to fit our enterprise situation,” says WVNET’s Brooks. 
Some vendors, in his experience, “try very hard to lock you into multiyear contracts with little added value. Some conglomerates have offered us software they think is a great deal but that we don’t necessarily need. In that sense, the whole merger mania for us every year looks a little gloomier, because we have to deal with a smaller number of suppliers that give us heartburn. But then again, I’m not a stockholder.” In addition, says Brooks, “we tend to lose relationships with developers we may have worked with for 10 years. All of a sudden there’s a layer of management between us and them–if they’re still there.”

Agreeing with Brooks is Lynn Manzano, Y2K project manager at Experian Inc., a leading supplier of information on consumers, businesses, motor vehicles, and property, in Orange, Calif. “You do lose a lot of continuity, and you lose the depth of support.” On the other hand, she says, “Hopefully I’ll get a better price. With some of these bundled purchases we saved a lot of money, from a cost perspective. From a functionality perspective, I don’t know yet” what the benefits of a merger or acquisition will be.

Another overriding concern for IT in 1998 was the Y2K issue. 1998 was the year the software industry and the IT community got serious about Y2K, scrambling to address millennium date issues before year-end, so 1999 could be spent testing. Among the Software 500, 89% of the companies reported that their primary software products are now Y2K compliant. Only 1% of the companies said they will not be compliant by year-end ’99, nor will they make it by 2000. And 10% of the companies did not respond.

The Y2K issue proved to be a double-edged sword for IT. On one hand, many organizations were forced to put off new development projects to concentrate on their millennium fix. On the other hand, Y2K has prompted a massive updating of legacy systems to, in many cases, new packaged applications. Says WVNET’s Brooks, “In 1998 we were largely sidetracked by Y2K.
Every application we were working with from mid-year 1998 through now had been with an eye to getting everything done for Y2K. The fringe benefit is massive updating. I think, architecturally, that’s clearing out a lot of deadwood. It’s really a new broom–inadvertently.”

And while it didn’t get quite the attention in the U.S. that the Y2K issue received, Europeans were busy grappling with the debut of the Euro currency, which observers say creates a more complex coding challenge than Y2K, as it affects fundamental business rules. Among the Software 500, which are predominantly U.S.-based companies, 50% reported that their primary software products are Euro compliant, while 6% said they are not compliant yet. Twenty-nine percent reported that Euro compliancy was not applicable to their software products, while 15% declined to answer.

With the Euro now launched, and Y2K soon to be winding down, what’s ahead for 1999 and beyond? WVNET’s Brooks cites storage management as an area to watch–“the whole issue of integrating storage management across the enterprise, particularly the interoperability of storage and the prospects for data interchange on a rapid, secure basis.” Says Mark Gorenberg, a partner in the venture capital firm Hummer Winblad Venture Partners, San Francisco, Calif., “Lower cost and plentifulness of storage is a huge trend for people in IT.” Other trends, Gorenberg notes, include the management of the extended enterprise, the movement of ERP vendors to e-commerce, and the rise of vertical software markets. “Vertical markets are very much in vogue. It’s possible to create a vertical company as a standalone company now. Before, they couldn’t grow large enough.”

Brooks is looking to the new millennium to bring simplification. “It seems that over the 25 years I’ve been following the industry, complexity grows at a fairly good clip until people refuse to tolerate it, then something comes out to simplify it.
It used to be the desktop environment was fairly complicated, then Windows came out, and pretty much everybody’s PC looked the same for a few years. There has been another one of those with the Web, but it’s not done yet. I have a feeling our architectural issues in about four years will look very different. But I’m not seeing anything from the pundits that satisfies me about what could be the next big thing.”

So is life e-beautiful for IT professionals today? Both WVNET’s Brooks and Experian’s Manzano agree there are lots of opportunities for IT professionals, both employment-wise and innovation-wise. And the software they have to work with keeps getting better. “The tools are significantly more compatible,” says Brooks. “You can have a toolbox and have some hope that most of them work together somewhat.” With more packaged products, he notes there are also fewer opportunities to provide solutions for users that the software vendors won’t, “but at the same time you can devote more time to doing other things, like developing new applications, or training, or doing production evaluation, and spend less time looking at code.”

Gorenberg adds, “There are a number of huge opportunities in IT. Outsourcing has created real capitalism for people in IT. Outsourcing and the whole service provider phenomenon are growing like gangbusters. For the first time venture firms are funding service companies, and those companies are growing and going public, making great IT people the new rock stars.”

Linux Exploded For Several Reasons

20.11.2013 07:00  You can’t work in the information technology sector and not be touched by Linux in some way, even if it’s only in a debate about what role Linux might play in your organization. Linux is an Open Source operating system developed by a team of programmers led by Linus Torvalds and originally targeted at the Intel x86 platform. Linux is a UNIX-like operating system, but since it was written completely from scratch, it uses no original UNIX source code. On the positive side, that means there are no copyright restrictions on the code. The biggest disadvantage, however, is that the operating system isn’t built upon a base of heavily used code. It also doesn’t have all the features you’d expect from a production UNIX system, like large-scale storage management. Red Hat recently introduced an Enterprise Edition that includes Computer Associates’ ARCserveIT. The tacit admission is that a 100 percent Open Source operating system still leaves high-end users wanting.

Linux began in the early 1990s and has moved from Intel to Motorola, Alpha, Sparc, and other major platforms. Because of its broad base of programmer support and its ability to run software from the large GNU software library, Linux is a viable alternative to both commercial UNIX servers, and Microsoft Windows and NT client machines. Ease of use–a common concern with UNIX–is less of a problem with two widely available desktop environments, KDE and GNOME, both of which are easy for Windows or Macintosh OS users and developers to learn. Both environments support a variety of themes that affect the appearance of windows, buttons, and scrollbars, letting users adopt a Windows or Macintosh look and feel.

To be a serious contender in the e-business market, Linux needs to offer more than a pretty desktop, and it does. First, it’s becoming the operating system of choice for high-end server vendors, such as IBM, Silicon Graphics, and Compaq Computer.
Other vendors, such as Dell Computer, offer Linux preconfigured on their servers. Customer support is available from hardware manufacturers, as well as from Linux distributors like Red Hat, Caldera, and SuSE. Also, Linux tools abound. The most popular Linux distributions include web and FTP servers, as well as other Internet applications such as Gopher, Domain Name Services, Mail, News, Proxy, and Search servers. The breadth of applications available with the widely supported, inexpensive operating system makes it an ideal candidate for web server applications. Industry analysts report more than 30 percent of web servers run on Linux.

Apache: Web server of choice

The Apache Web Server is a freely distributed HTTP server developed by the Apache Group and managed by the Apache Project. Evolving from the National Center for Supercomputing Applications web server developed at the University of Illinois, Apache has become the most popular web server. According to a Netcraft survey of more than 13 million web sites conducted in March 2000, Apache is used by 60 percent of the respondents. The next most popular server, Microsoft IIS, came in at just under 21 percent.

Popularity, though, is hardly the only consideration in judging e-business software. Scalability, reliability, and integration with other applications are crucial. Apache has been a stable platform on UNIX for some time, but Windows implementations haven’t proved as reliable. The Apache Group’s recently announced Apache 2.0 includes better support for non-UNIX platforms, which should improve Windows stability. On the performance side, Apache 2.0 supports a hybrid multiprocess/multithreaded mode that promises to improve scalability, though more real-world use is required before we know how well it meets that promise.
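The cost that a multiprocess/multithreaded server tries to reduce, namely starting a fresh process for every request, is easy to measure. The following sketch is purely illustrative (Python standing in for the server; this is not Apache code): it compares handling a request in an already-running interpreter with spawning a new interpreter process each time.

```python
import subprocess
import sys
import time

def in_process(name):
    # "Embedded" model: the interpreter is already running,
    # so handling a request is just a function call.
    return f"Hello, {name}"

def new_process(name):
    # "Process-per-request" model: pay full interpreter
    # startup cost on every single request.
    out = subprocess.run(
        [sys.executable, "-c", f"print('Hello, {name}')"],
        capture_output=True, text=True, check=True)
    return out.stdout.strip()

N = 10
start = time.perf_counter()
for _ in range(N):
    in_process("world")
embedded_time = time.perf_counter() - start

start = time.perf_counter()
for _ in range(N):
    new_process("world")
spawned_time = time.perf_counter() - start

print(f"in-process: {embedded_time:.5f}s  spawned: {spawned_time:.5f}s")
```

On any typical machine the spawned variant is dramatically slower, which is why persistent worker processes and threads pay off for busy servers.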
The Apache Software Foundation supports projects focused on integrating Apache into the larger e-business environment, including XML-Apache, Java-Apache, and Java Servlet and Java Server Page support. The mod_perl project provides developers the tools to create Apache modules in Perl that eliminate the need to run CGI scripts in a separate process, thus avoiding costly process instantiation. Web servers also must provide access to middle-tier business services, especially those driven by database applications. Apache is recognized as a viable HTTP server by high-end web application servers, such as the Oracle Application Server and IBM WebSphere.

Like Linux, Apache’s wide market acceptance demonstrates that Open Source development can create an essential tool for e-business. The release of Apache 2.0 further shows that Open Source software can keep up with the demands of changing needs. However, while Apache is a solid choice for UNIX and Linux platforms, until a stable Windows version is available, Microsoft IIS is probably a better alternative for NT.

Perl: Portable programming

While C/C++ is a portable programming language, the learning curve and time required to develop fully functioning applications is oftentimes prohibitive. While Java has eliminated some of the most problematic aspects of C–especially pointers–and offers a rich set of libraries like C/C++, development is still coding-intensive. The Perl programming language has emerged as the programming tool of choice for as many as one million developers, according to The Perl Journal. Perl was originally used as a systems administration tool on UNIX platforms, but was quickly adopted for Web development and data-intensive programming because it includes many features found in other tools, such as C, awk, sed, and BASIC. According to http://www.perl.com, Perl is the most popular web programming language, due in large part to its powerful text manipulation.
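The kind of text manipulation that made Perl popular for web work is easy to picture with a short example. Python stands in for Perl here (same idea, different syntax), and the log lines and their format are invented for illustration.

```python
import re

# Hypothetical access-log lines; the format is assumed for illustration.
log = """\
10.0.0.1 - - [12/Mar/2000:10:01:22] "GET /index.html HTTP/1.0" 200 1043
10.0.0.2 - - [12/Mar/2000:10:01:25] "GET /missing.gif HTTP/1.0" 404 291
10.0.0.1 - - [12/Mar/2000:10:02:01] "POST /cgi-bin/form HTTP/1.0" 200 512
"""

# One regular expression pulls out the client address, request path,
# and status code: the sort of one-liner Perl is known for.
pattern = re.compile(r'^(\S+) .* "(?:GET|POST) (\S+) [^"]*" (\d{3})')

hits = {}
for line in log.splitlines():
    m = pattern.match(line)
    if m:
        addr, path, status = m.groups()
        hits[status] = hits.get(status, 0) + 1

print(hits)  # request counts per HTTP status code
```

Three lines of pattern-matching replace what would be a page of manual string slicing in C, which is the productivity argument the article is making.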
From an e-business perspective, the fact that Perl runs on so many platforms and can handle operating system administration tasks, as well as more traditional database-oriented tasks, makes it an ideal candidate for a development tool. Since Perl applications can be written to run as embedded modules in Apache, web server processing can improve as much as 2,000 percent. Can Perl pass muster in an e-business environment? Amazon.com and Deja.com are two of the major dot-coms that use Perl to run their sites. In addition to Apache integration, the nsapi_perl module embeds a Perl interpreter in a Netscape web server, so e-businesses don’t have to trade away the faster performance of an embedded interpreter if they don’t use Apache. The Perl language is also widely supported by third-party developers, who offer more than 400 modules through the Comprehensive Perl Archive Network (CPAN). Perl has been ported to major UNIX platforms, as well as Microsoft Windows and NT, and Macintosh. Also, Perl can easily integrate with databases through DBI modules, making it an ideal tool for creating everything from database-centric web pages to data warehouse extraction, transformation, and load scripts.

While popular, Perl syntax can be cryptic. This is understandable given its origin as a systems management-oriented tool, but it limits Perl’s use for large-scale software development. Python, an object-oriented scripting language, is a better choice if you’re looking at larger applications where reuse and object orientation will pay off.

Databases: The weak link

When we think of e-business and databases we generally think of the big names: Oracle, IBM DB2, Microsoft SQL Server, Sybase, and Informix. Occasionally, we’ll hear about the two most popular Open Source offerings–MySQL and Postgres–but not too often. Why not? Those offerings simply can’t compete with the features and functionality of today’s commercial relational database management systems.
In some cases, the lack of features makes the Open Source databases unusable in an e-business environment. For example, MySQL’s lack of subqueries and right outer joins requires developer work-arounds. More seriously, the database’s lack of transaction support completely eliminates it as a serious contender for e-business. The transaction support found in major commercial offerings makes it possible to define a logical unit of work consisting of a series of steps that must all occur for the transaction to be completed. For example, transferring funds from your savings to your checking account may consist of two distinct steps–withdrawing money from savings and then depositing it into checking. Without a way to group those two steps, it would be possible for the withdrawal to be made without the corresponding deposit. That could occur if the server crashed in the middle of the operation, and no server is immune to crashing or other problems that could disrupt a transaction.

PostgreSQL, originally developed at the University of California at Berkeley, has many of the features found in commercial database systems, including transactions, stored procedures, and extensive SQL support. If you’re looking for an Open Source database, this is probably the best choice. However, you must remember there’s much more to a database than what the programmer sees. If you need to consider integration with enterprise resource planning systems, failure recovery, parallel query optimization, advanced partitioning options, or support for data warehousing, then the commercial offerings are your best bet.

Conclusion: No simple answer

Can an e-business succeed with Open Source? Of course. TCP/IP and other core Internet protocols were developed in an Open Source environment. The key to success is to not blindly ignore or embrace Open Source any more than you would a particular vendor’s offering.
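The savings-to-checking example above can be sketched in a few lines. This is a minimal illustration using Python's built-in sqlite3 module, chosen only because it supports transactions and needs no server; it is not one of the databases under discussion, and the account names and amounts are invented.

```python
import sqlite3

# In-memory database with two toy accounts.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [("savings", 500), ("checking", 100)])
conn.commit()

def transfer(conn, amount):
    """Withdraw from savings and deposit into checking as one logical
    unit of work: either both steps happen or neither does."""
    try:
        # The connection context manager opens a transaction:
        # commit on success, roll back on any exception.
        with conn:
            conn.execute("UPDATE accounts SET balance = balance - ? "
                         "WHERE name = 'savings'", (amount,))
            # Simulate the server crashing between the two steps.
            if amount > 1000:
                raise RuntimeError("server crashed mid-transfer")
            conn.execute("UPDATE accounts SET balance = balance + ? "
                         "WHERE name = 'checking'", (amount,))
        return True
    except RuntimeError:
        return False

transfer(conn, 200)    # succeeds: both steps are committed together
transfer(conn, 2000)   # fails mid-way: the withdrawal is rolled back
balances = dict(conn.execute("SELECT name, balance FROM accounts"))
print(balances)  # the failed transfer left no trace in either account
```

After the failed transfer, both balances reflect only the successful one; without transaction grouping, the withdrawal alone could have been written to disk, which is exactly the hazard the article describes.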
Open Source offers strong programming tools, Web servers, and a popular operating system that’s making steady inroads into production environments. As with your Web server, the database must be reliable, scalable, and easy to integrate with your other systems. If you stick to the most widely supported Open Source solutions, including Linux, Apache, and Perl–to name just three–you can build a stable, reliable e-business platform. Linux isn’t as established as UNIX, and it may not scale or promise the kinds of uptime you expect from UNIX, but for small- and mid-sized web sites, it may fit the bill. Apache and Perl both have strong developer support, and in key areas, such as database access and performance, they get as much attention as any commercial product would.

For real-world, large-scale e-business, Open Source mixed with commercial applications is the best approach. While making significant inroads, Open Source still can’t marshal the resources to keep up with changes in technology. Red Hat’s alliance with Computer Associates for high-end storage options is a case in point. While advanced storage applications and robust databases may emerge from Open Source development, there aren’t any now–and we need them now.

GNOME 3.10.2 Release

19.11.2013 01:45 Hello all, Here comes GNOME 3.10.2, the second update to GNOME 3.10. It includes many fixes, various improvements, and translation updates over 3.10.1; we hope you'll enjoy it. For more information about the major changes in GNOME 3.10, please visit our release notes: http://library.gnome.org/misc/release-notes/3.10/

Release Details and References

The lists of updated modules and changes are available here:
core - http://download.gnome.org/core/3.10/3.10.2/NEWS
apps - http://download.gnome.org/apps/3.10/3.10.2/NEWS

The source packages are available here:
core - http://download.gnome.org/core/3.10/3.10.2/sources/
apps - http://download.gnome.org/apps/3.10/3.10.2/sources/

And if you want to compile GNOME 3.10.2 yourself, you can use the jhbuild modulesets available here: http://download.gnome.org/teams/releng/3.10.2/
